Mar 17 11:11:35 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 17 11:11:35 crc restorecon[4698]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 11:11:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 11:11:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 11:11:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 17 11:11:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:35 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:35 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to
system_u:object_r:container_file_t:s0:c336,c787 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 
11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 17 11:11:36 crc 
restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 
11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 17 11:11:36 crc restorecon[4698]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 
11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 17 11:11:36 crc restorecon[4698]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 17 11:11:36 crc restorecon[4698]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Mar 17 11:11:37 crc kubenswrapper[4742]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 17 11:11:37 crc kubenswrapper[4742]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 17 11:11:37 crc kubenswrapper[4742]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 17 11:11:37 crc kubenswrapper[4742]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
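The long run of "not reset as customized by admin" messages above is expected: restorecon treats SELinux types listed in the policy's customizable_types set (container_file_t among them) as admin-managed and leaves them alone unless forced. The sketch below shows how one might verify that on a similar host; it is illustrative only, assumes the targeted policy with the semanage/matchpathcon tools installed (policycoreutils-python-utils, libselinux-utils), and the paths merely mirror the log rather than being taken from this node.

    # List the types restorecon declines to reset by default (customizable types).
    cat /etc/selinux/targeted/contexts/customizable_types

    # Compare the context a path currently carries with what policy would assign.
    ls -Z /var/lib/kubelet/device-plugins
    matchpathcon /var/lib/kubelet/device-plugins

    # List local file-context customizations added by an admin or by tooling.
    semanage fcontext -l -C

    # Dry run: report what a recursive relabel would change without changing it.
    restorecon -R -n -v /var/lib/kubelet

Only a forced relabel (restorecon -F) would rewrite the container_file_t entries shown above, and that would strip the MCS category pairs (s0:c7,c13 and the like) that keep one pod's files isolated from another's.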
Mar 17 11:11:37 crc kubenswrapper[4742]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 17 11:11:37 crc kubenswrapper[4742]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.962273 4742 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973381 4742 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973415 4742 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973420 4742 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973423 4742 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973427 4742 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973431 4742 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973434 4742 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973438 4742 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973444 4742 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
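The deprecation warnings above all point the same way: these switches belong in the file passed via the kubelet's --config flag, not on the command line. Below is a minimal sketch of such a migration. Every value is an assumption for illustration, not read from this node: the file path, the CRI-O socket, and the reserved/eviction numbers are hypothetical, while KMSv1=true is the one gate the log itself shows being set.

    # Hypothetical drop-in replacing the deprecated flags; values are illustrative.
    cat <<'EOF' > /etc/kubernetes/kubelet-config.yaml
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock     # was --container-runtime-endpoint
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec # was --volume-plugin-dir
    registerWithTaints: []                                       # was --register-with-taints
    systemReserved:                                              # was --system-reserved
      cpu: 500m
      memory: 1Gi
    evictionHard:                  # the replacement the --minimum-container-ttl-duration warning suggests
      memory.available: 100Mi
    featureGates:
      KMSv1: true                  # the deprecated gate the log shows being enabled
    EOF
    # The kubelet is then started with: kubelet --config /etc/kubernetes/kubelet-config.yaml

--pod-infra-container-image is omitted deliberately: its own warning says the sandbox image now comes from the CRI side (CRI-O's pause_image setting) rather than from kubelet configuration. The "unrecognized feature gate" warnings that follow are OpenShift cluster-level gates passed through to the kubelet's featureGates map; the kubelet logs each name it does not know and continues starting up.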
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973449 4742 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973453 4742 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973457 4742 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973461 4742 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973465 4742 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973469 4742 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973473 4742 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973477 4742 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973481 4742 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973486 4742 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973490 4742 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973494 4742 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973497 4742 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973501 4742 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973505 4742 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973509 4742 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973513 4742 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973516 4742 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973519 4742 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973523 4742 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973527 4742 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973539 4742 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973543 4742 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973546 4742 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973550 4742 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973554 4742 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973558 4742 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973562 4742 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973565 4742 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973568 4742 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973573 4742 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973579 4742 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973583 4742 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973587 4742 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973593 4742 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973598 4742 feature_gate.go:330] unrecognized feature gate: Example
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973604 4742 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973609 4742 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973613 4742 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973617 4742 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973622 4742 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973628 4742 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973632 4742 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973636 4742 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973639 4742 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973643 4742 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973647 4742 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973650 4742 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973654 4742 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973658 4742 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973662 4742 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973666 4742 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973670 4742 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973675 4742 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973680 4742 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973684 4742 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973689 4742 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973693 4742 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973697 4742 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973701 4742 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973708 4742 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.973713 4742 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973817 4742 flags.go:64] FLAG: --address="0.0.0.0"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973827 4742 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973876 4742 flags.go:64] FLAG: --anonymous-auth="true"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973883 4742 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973888 4742 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973892 4742 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973898 4742 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973922 4742 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973927 4742 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973931 4742 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973936 4742 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973943 4742 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973948 4742 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973952 4742 flags.go:64] FLAG: --cgroup-root=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973956 4742 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973960 4742 flags.go:64] FLAG: --client-ca-file=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973964 4742 flags.go:64] FLAG: --cloud-config=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973968 4742 flags.go:64] FLAG: --cloud-provider=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973972 4742 flags.go:64] FLAG: --cluster-dns="[]"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973978 4742 flags.go:64] FLAG: --cluster-domain=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973982 4742 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973986 4742 flags.go:64] FLAG: --config-dir=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973990 4742 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.973995 4742 flags.go:64] FLAG: --container-log-max-files="5"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974002 4742 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974006 4742 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974010 4742 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974015 4742 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974020 4742 flags.go:64] FLAG: --contention-profiling="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974024 4742 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974028 4742 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974033 4742 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974037 4742 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974048 4742 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974052 4742 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974056 4742 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974060 4742 flags.go:64] FLAG: --enable-load-reader="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974065 4742 flags.go:64] FLAG: --enable-server="true"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974069 4742 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974074 4742 flags.go:64] FLAG: --event-burst="100"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974079 4742 flags.go:64] FLAG: --event-qps="50"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974083 4742 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974088 4742 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974092 4742 flags.go:64] FLAG: --eviction-hard=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974097 4742 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974101 4742 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974105 4742 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974111 4742 flags.go:64] FLAG: --eviction-soft=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974116 4742 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974120 4742 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974124 4742 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974128 4742 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974132 4742 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974136 4742 flags.go:64] FLAG: --fail-swap-on="true"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974140 4742 flags.go:64] FLAG: --feature-gates=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974145 4742 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974149 4742 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974154 4742 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974158 4742 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974163 4742 flags.go:64] FLAG: --healthz-port="10248"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974167 4742 flags.go:64] FLAG: --help="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974171 4742 flags.go:64] FLAG: --hostname-override=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974175 4742 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974179 4742 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974183 4742 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974187 4742 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974191 4742 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974195 4742 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974199 4742 flags.go:64] FLAG: --image-service-endpoint=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974204 4742 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974208 4742 flags.go:64] FLAG: --kube-api-burst="100"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974212 4742 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974216 4742 flags.go:64] FLAG: --kube-api-qps="50"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974220 4742 flags.go:64] FLAG: --kube-reserved=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974225 4742 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974229 4742 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974233 4742 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974237 4742 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974241 4742 flags.go:64] FLAG: --lock-file=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974245 4742 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974251 4742 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974256 4742 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974263 4742 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974270 4742 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974275 4742 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974280 4742 flags.go:64] FLAG: --logging-format="text"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974286 4742 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974292 4742 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974297 4742 flags.go:64] FLAG: --manifest-url=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974302 4742 flags.go:64] FLAG: --manifest-url-header=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974309 4742 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974313 4742 flags.go:64] FLAG: --max-open-files="1000000"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974318 4742 flags.go:64] FLAG: --max-pods="110"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974322 4742 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974327 4742 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974331 4742 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974335 4742 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974339 4742 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974343 4742 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974347 4742 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974358 4742 flags.go:64] FLAG: --node-status-max-images="50"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974362 4742 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974366 4742 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974370 4742 flags.go:64] FLAG: --pod-cidr=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974374 4742 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974381 4742 flags.go:64] FLAG: --pod-manifest-path=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974385 4742 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974390 4742 flags.go:64] FLAG: --pods-per-core="0"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974394 4742 flags.go:64] FLAG: --port="10250"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974398 4742 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974402 4742 flags.go:64] FLAG: --provider-id=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974406 4742 flags.go:64] FLAG: --qos-reserved=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974410 4742 flags.go:64] FLAG: --read-only-port="10255"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974415 4742 flags.go:64] FLAG: --register-node="true"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974419 4742 flags.go:64] FLAG: --register-schedulable="true"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974422 4742 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974430 4742 flags.go:64] FLAG: --registry-burst="10"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974435 4742 flags.go:64] FLAG: --registry-qps="5"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974440 4742 flags.go:64] FLAG: --reserved-cpus=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974446 4742 flags.go:64] FLAG: --reserved-memory=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974453 4742 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974458 4742 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974463 4742 flags.go:64] FLAG: --rotate-certificates="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974468 4742 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974473 4742 flags.go:64] FLAG: --runonce="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974478 4742 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974485 4742 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974490 4742 flags.go:64] FLAG: --seccomp-default="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974494 4742 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974498 4742 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974502 4742 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974506 4742 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974511 4742 flags.go:64] FLAG: --storage-driver-password="root"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974515 4742 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974521 4742 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974526 4742 flags.go:64] FLAG: --storage-driver-user="root"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974532 4742 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974537 4742 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974542 4742 flags.go:64] FLAG: --system-cgroups=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974548 4742 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974557 4742 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974562 4742 flags.go:64] FLAG: --tls-cert-file=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974569 4742 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974576 4742 flags.go:64] FLAG: --tls-min-version=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974580 4742 flags.go:64] FLAG: --tls-private-key-file=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974584 4742 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974588 4742 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974593 4742 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974598 4742 flags.go:64] FLAG: --v="2"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974605 4742 flags.go:64] FLAG: --version="false"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974612 4742 flags.go:64] FLAG: --vmodule=""
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974618 4742 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.974623 4742 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974747 4742 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974755 4742 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974763 4742 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974768 4742 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974773 4742 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974777 4742 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974782 4742 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974787 4742 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974792 4742 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974797 4742 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974802 4742 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974807 4742 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974812 4742 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974817 4742 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974822 4742 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974827 4742 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974832 4742 feature_gate.go:330] unrecognized feature gate: Example
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974836 4742 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974842 4742 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974847 4742 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974851 4742 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974855 4742 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974860 4742 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974865 4742 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974872 4742 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974878 4742 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974883 4742 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974888 4742 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974894 4742 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974900 4742 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974921 4742 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974926 4742 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974931 4742 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974936 4742 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974940 4742 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974945 4742 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974949 4742 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974954 4742 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974960 4742 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974966 4742 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974972 4742 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974977 4742 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974982 4742 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974986 4742 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974991 4742 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.974995 4742 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975000 4742 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975004 4742 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975009 4742 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975013 4742 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975019 4742 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975025 4742 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975030 4742 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975034 4742 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975039 4742 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975043 4742 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975048 4742 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975053 4742 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975057 4742 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975061 4742 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975066 4742 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975070 4742 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975076 4742 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975080 4742 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975084 4742 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975089 4742 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975093 4742 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975097 4742 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975101 4742 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975106 4742 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.975110 4742 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.975118 4742 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.994430 4742 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.994484 4742 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994564 4742 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994574 4742 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994581 4742 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994590 4742 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994595 4742 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994599 4742 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994603 4742 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994607 4742 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994611 4742 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994615 4742 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994619 4742 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994623 4742 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994627 4742 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994631 4742 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994635 4742 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994639 4742 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994644 4742 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994649 4742 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994653 4742 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994658 4742 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994665 4742 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994670 4742 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994674 4742 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994679 4742 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994683 4742 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994689 4742 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994694 4742 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994700 4742 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994705 4742 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994710 4742 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994716 4742 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994720 4742 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994725 4742 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994729 4742 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994741 4742 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994745 4742 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994749 4742 feature_gate.go:330] unrecognized feature gate: Example
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994753 4742 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994757 4742 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994761 4742 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994765 4742 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994769 4742 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994773 4742 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994777 4742 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994781 4742 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994786 4742 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994790 4742 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994794 4742 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994798 4742 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994802 4742 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994806 4742 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994811 4742 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994818 4742 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994824 4742 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994828 4742 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994833 4742 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994837 4742 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994841 4742 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994845 4742 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994849 4742 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994853 4742 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994857 4742 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994860 4742 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994864 4742 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994868 4742 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994871 4742 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994875 4742 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994879 4742 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994883 4742 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994886 4742 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.994891 4742 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.994898 4742 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995058 4742 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995065 4742 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995070 4742 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995074 4742 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995079 4742 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995082 4742 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995086 4742 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995090 4742 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995094 4742 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995098 4742 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995102 4742 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995108 4742 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995113 4742 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995117 4742 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995122 4742 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995125 4742 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995129 4742 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995134 4742 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995138 4742 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995143 4742 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995148 4742 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995154 4742 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995159 4742 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995164 4742 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995169 4742 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995174 4742 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995179 4742 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995183 4742 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995187 4742 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995191 4742 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995196 4742 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995200 4742 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995204 4742 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995208 4742 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995212 4742 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995217 4742 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995221 4742 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995224 4742 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995229 4742 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995234 4742 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995238 4742 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995242 4742 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995246 4742 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995250 4742 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995256 4742 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995261 4742 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995266 4742 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995271 4742 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995275 4742 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995280 4742 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995284 4742 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995288 4742 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995292 4742 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995296 4742 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995301 4742 feature_gate.go:330] unrecognized feature gate: Example
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995304 4742 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995308 4742 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995312 4742 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995316 4742 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995320 4742 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995324 4742 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995328 4742 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995332 4742 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995335 4742 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995339 4742 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995343 4742 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995347 4742 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995351 4742 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995355 4742 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995359 4742 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 17 11:11:37 crc kubenswrapper[4742]: W0317 11:11:37.995364 4742 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.995370 4742 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 17 11:11:37 crc kubenswrapper[4742]: I0317 11:11:37.995547 4742 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 17 11:11:38 crc kubenswrapper[4742]: E0317 11:11:38.015872 4742 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.020065 4742 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.020164 4742 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.035448 4742 server.go:997] "Starting client certificate rotation"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.035500 4742 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.035711 4742 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.156409 4742 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 17 11:11:38 crc kubenswrapper[4742]: E0317 11:11:38.164484 4742 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.172673 4742 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.221534 4742 log.go:25] "Validated CRI v1 runtime API"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.431560 4742 log.go:25] "Validated CRI v1 image API"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.441874 4742 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.456935 4742 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-17-11-07-18-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.456970 4742 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.483252 4742 manager.go:217] Machine: {Timestamp:2026-03-17 11:11:38.469558062 +0000 UTC m=+1.595685830 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:6693cb74-dd53-4aae-b4e6-7786830660f7 BootID:a949f061-2bf4-4376-98c3-0527ac24d2e9 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:8d:16:f1 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:8d:16:f1 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:7a:07:2d Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:73:7b:ec Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:5a:1b:45 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:89:07:58 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:4e:bc:5c:07:3d:ed Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:e6:12:45:61:c4:69 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288
Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.483709 4742 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
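
The `manager.go:217] Machine:` entry is cAdvisor's one-shot hardware inventory: 12 vCPUs each presented as a single-core socket (typical of a KVM guest such as this CRC VM), 33654132736 bytes (~31.3 GiB) of memory, no swap, the vda disk with its filesystems, and the bridge/VLAN interfaces. A value like MemoryCapacity is ultimately read from `/proc/meminfo`; a small sketch of that read, as a stand-in for cAdvisor's machine-info probe rather than its actual code:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strconv"
	"strings"
)

// memoryCapacityBytes reads MemTotal from /proc/meminfo, the source behind
// the MemoryCapacity field logged above (33654132736 bytes on this node).
// /proc/meminfo reports kB, so the value is scaled to bytes.
func memoryCapacityBytes() (uint64, error) {
	f, err := os.Open("/proc/meminfo")
	if err != nil {
		return 0, err
	}
	defer f.Close()
	sc := bufio.NewScanner(f)
	for sc.Scan() {
		fields := strings.Fields(sc.Text()) // e.g. ["MemTotal:", "32865364", "kB"]
		if len(fields) >= 2 && fields[0] == "MemTotal:" {
			kb, err := strconv.ParseUint(fields[1], 10, 64)
			if err != nil {
				return 0, err
			}
			return kb * 1024, nil
		}
	}
	return 0, fmt.Errorf("MemTotal not found in /proc/meminfo")
}

func main() {
	b, err := memoryCapacityBytes()
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		return
	}
	fmt.Printf("MemoryCapacity: %d bytes\n", b)
}
```
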
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.484064 4742 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.498936 4742 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.499136 4742 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.499166 4742 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.522779 4742 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.522821 4742 container_manager_linux.go:303] "Creating device plugin manager" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.523432 4742 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.523456 4742 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.523657 4742 state_mem.go:36] "Initialized new in-memory state store" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.523825 4742 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.534738 4742 kubelet.go:418] "Attempting to sync node with API server" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.534759 4742 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
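
The `nodeConfig` dump above is the container manager's effective policy: systemd cgroup driver on cgroup v2, 200m CPU / 350Mi memory / 350Mi ephemeral storage reserved for the system, PodPidsLimit 4096, and five hard eviction thresholds expressed either as absolute quantities (memory.available < 100Mi) or as percentages of capacity (nodefs.available < 10%, imagefs.available < 15%). A minimal sketch of the comparison those threshold entries encode — an assumed simplification, not the kubelet's eviction manager:

```go
package main

import "fmt"

// threshold mirrors one HardEvictionThresholds entry from the nodeConfig
// logged above: either an absolute quantity (bytes) or a percentage of
// capacity, compared with the LessThan operator against the observed signal.
type threshold struct {
	signal     string
	quantity   uint64  // bytes; zero when percentage is used
	percentage float64 // fraction of capacity; zero when quantity is used
}

// crossed reports whether the observed value has fallen below the limit,
// i.e. whether "LessThan" fires for this signal.
func (t threshold) crossed(observed, capacity uint64) bool {
	limit := t.quantity
	if t.percentage > 0 {
		limit = uint64(float64(capacity) * t.percentage)
	}
	return observed < limit
}

func main() {
	// memory.available: {"Quantity":"100Mi"} in the log.
	mem := threshold{signal: "memory.available", quantity: 100 << 20}
	// nodefs.available: {"Percentage":0.1} in the log.
	disk := threshold{signal: "nodefs.available", percentage: 0.1}

	fmt.Println(mem.crossed(80<<20, 33654132736))  // true: 80Mi < 100Mi
	fmt.Println(disk.crossed(10<<30, 85292941312)) // false: ~10Gi free > 10% of ~79Gi
}
```
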
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.534774 4742 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.534786 4742 kubelet.go:324] "Adding apiserver pod source" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.534797 4742 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.539998 4742 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.542237 4742 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 17 11:11:38 crc kubenswrapper[4742]: W0317 11:11:38.551899 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 17 11:11:38 crc kubenswrapper[4742]: E0317 11:11:38.552034 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 17 11:11:38 crc kubenswrapper[4742]: W0317 11:11:38.551867 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 17 11:11:38 crc kubenswrapper[4742]: E0317 11:11:38.552084 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.561307 4742 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.570231 4742 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.570281 4742 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.570290 4742 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.570296 4742 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.570308 4742 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.570315 4742 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.570322 4742 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.570334 4742 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.570344 4742 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.570352 4742 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.570364 4742 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.570371 4742 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.571318 4742 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.571981 4742 server.go:1280] "Started kubelet" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.572159 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 17 11:11:38 crc systemd[1]: Started Kubernetes Kubelet. Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.575825 4742 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.584328 4742 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.584359 4742 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.584381 4742 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.586941 4742 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.587042 4742 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 17 11:11:38 crc kubenswrapper[4742]: E0317 11:11:38.587386 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.587419 4742 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.587456 4742 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.589873 4742 factory.go:55] Registering systemd factory Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.589930 4742 factory.go:221] Registration of the systemd container factory successfully Mar 17 11:11:38 crc kubenswrapper[4742]: W0317 11:11:38.590396 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 17 11:11:38 crc kubenswrapper[4742]: E0317 11:11:38.590539 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError"
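
Every failed request above and below (`dial tcp 38.102.83.230:6443: connect: connection refused`) has the same cause: on this single-node cluster the kubelet comes up before the kube-apiserver it is itself about to launch from `/etc/kubernetes/manifests`, so CSR submission, informer list/watch, event posting, and the node lease all fail until the static pods are running. Each client simply retries on its own backoff (the lease controller logs `interval="200ms"`). A sketch of that wait-for-endpoint pattern as a hypothetical standalone helper — client-go's reflectors implement their own backoff internally:

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// waitForAPIServer dials the apiserver endpoint until it accepts TCP
// connections, doubling the delay up to a cap. This is the same shape of
// retry loop the "will retry" messages in the log describe.
func waitForAPIServer(addr string, maxWait time.Duration) error {
	delay := 200 * time.Millisecond // matches the lease controller's first interval
	deadline := time.Now().Add(maxWait)
	for {
		conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
		if err == nil {
			conn.Close()
			return nil
		}
		if time.Now().After(deadline) {
			return fmt.Errorf("apiserver %s not reachable: %w", addr, err)
		}
		fmt.Printf("dial %s failed (%v), retrying in %v\n", addr, err, delay)
		time.Sleep(delay)
		if delay *= 2; delay > 10*time.Second {
			delay = 10 * time.Second
		}
	}
}

func main() {
	_ = waitForAPIServer("api-int.crc.testing:6443", time.Minute)
}
```
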
Mar 17 11:11:38 crc kubenswrapper[4742]: E0317 11:11:38.591720 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="200ms" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.601208 4742 factory.go:153] Registering CRI-O factory Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.601247 4742 factory.go:221] Registration of the crio container factory successfully Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.601349 4742 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.601378 4742 factory.go:103] Registering Raw factory Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.601397 4742 manager.go:1196] Started watching for new ooms in manager Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.602170 4742 manager.go:319] Starting recovery of all containers Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.604276 4742 server.go:460] "Adding debug handlers to kubelet server" Mar 17 11:11:38 crc kubenswrapper[4742]: E0317 11:11:38.601143 4742 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.230:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189d9c7f3e2fbd13 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.571939091 +0000 UTC m=+1.698066849,LastTimestamp:2026-03-17 11:11:38.571939091 +0000 UTC m=+1.698066849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609272 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609310 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609322 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609333 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609343
4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609380 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609391 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609400 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609447 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609459 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609469 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609482 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609499 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609519 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609529 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609545 4742 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609555 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.609564 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612412 4742 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612441 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612455 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612489 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612504 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612516 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612526 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612537 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612549 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612563 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612574 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612609 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612620 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612634 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612645 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612658 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612668 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612679 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612694 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612705 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612717 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612729 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612741 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612751 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612766 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612777 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612788 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612804 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612816 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612834 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612868 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612879 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612894 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612928 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612940 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612958 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612974 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612986 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.612999 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613012 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613027 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613042 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613053 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613066 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613075 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613087 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613096 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613106 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613115 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613124 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613134 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613144 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613159 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613194 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613210 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613222 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613240 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613254 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613263 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613273 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613284 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613295 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613305 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613320 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613331 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613340 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613349 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613358 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613367 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613377 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613391 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613405 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613414 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613423 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613432 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613446 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613459 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613471 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613485 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613495 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613503 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613513 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613522 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613536 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613547 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613558 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613568 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613584 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613597 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613608 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613621 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613637 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613647 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613658 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613672 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613689 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613699 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613713 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613727 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613737 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613751 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613760 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613770 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613779 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613788 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613797 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613807 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613818 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613827 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613838 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613847 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613857 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613867 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613879 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613896 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613922 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613931 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613940 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613950 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613959 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613969 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613979 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613988 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.613999 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614009 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614019 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614028 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614038 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614047 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614056 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614065 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614074 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614084 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614095 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614106 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614118 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614129 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614140 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614150 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614160 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614171 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614183 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614194 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614204 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614215 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614226 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614235 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614244 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614253 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614262 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614272 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614281 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614290 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614298 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614307 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614317 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614326 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614336 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614345 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614384 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614393 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614405 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614416 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614427 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614438 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614449 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614461 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614469 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614478 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614487 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614495 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614505 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614515 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614525 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614534 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614545 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614558 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614568 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614578 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614587 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614596 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614606 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614616 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614626 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614636 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614646 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614656 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614665 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614674 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614686 4742 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614706 4742 reconstruct.go:97] "Volume reconstruction finished" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.614714 4742 reconciler.go:26] "Reconciler: start to sync state" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.617846 4742 manager.go:324] Recovery completed Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.634403 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.635894 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.635959 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.635972 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.636397 4742 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.636412 4742 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.636428 4742 state_mem.go:36] "Initialized new in-memory state store" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.659868 4742 
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.659868 4742 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.661570 4742 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.661606 4742 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.661629 4742 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 17 11:11:38 crc kubenswrapper[4742]: E0317 11:11:38.661742 4742 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 17 11:11:38 crc kubenswrapper[4742]: W0317 11:11:38.662359 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused
Mar 17 11:11:38 crc kubenswrapper[4742]: E0317 11:11:38.662464 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError"
Mar 17 11:11:38 crc kubenswrapper[4742]: E0317 11:11:38.688234 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:11:38 crc kubenswrapper[4742]: E0317 11:11:38.762168 4742 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.774552 4742 policy_none.go:49] "None policy: Start"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.775894 4742 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.775949 4742 state_mem.go:35] "Initializing new in-memory state store"
Mar 17 11:11:38 crc kubenswrapper[4742]: E0317 11:11:38.788337 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:11:38 crc kubenswrapper[4742]: E0317 11:11:38.793252 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="400ms"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.851717 4742 manager.go:334] "Starting Device Plugin manager"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.851931 4742 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.851951 4742 server.go:79] "Starting device plugin registration server"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.852428 4742 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.852444 4742 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.852681 4742 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
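
Everything that needs the API server is failing at this point with "dial tcp 38.102.83.230:6443: connect: connection refused": the RuntimeClass reflector, the node lister ("node \"crc\" not found"), and the node-lease controller, which logs "will retry" with interval="400ms" here and interval="800ms" on its next failure (11:11:39.194528 below). That doubling is plain exponential backoff; the sketch below only mirrors the observed cadence, the kubelet's real retry plumbing differs:

    // Illustrative only: the doubling retry cadence visible in the lease
    // controller's interval="400ms" -> interval="800ms" progression.
    package main

    import (
    	"errors"
    	"fmt"
    	"time"
    )

    // retryWithBackoff retries op, doubling the wait after each failure
    // up to a cap, and reports each retry like the log above.
    func retryWithBackoff(initial, max time.Duration, op func() error) error {
    	interval := initial
    	for {
    		err := op()
    		if err == nil {
    			return nil
    		}
    		fmt.Printf("will retry: %v interval=%v\n", err, interval)
    		time.Sleep(interval)
    		if interval *= 2; interval > max {
    			interval = max
    		}
    	}
    }

    func main() {
    	attempts := 0
    	_ = retryWithBackoff(400*time.Millisecond, 7*time.Second, func() error {
    		if attempts++; attempts < 3 {
    			return errors.New("dial tcp 38.102.83.230:6443: connect: connection refused")
    		}
    		return nil // the API server eventually answers
    	})
    }
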
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.852809 4742 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.852822 4742 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 17 11:11:38 crc kubenswrapper[4742]: E0317 11:11:38.858435 4742 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.952876 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.954857 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.954926 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.954941 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.954974 4742 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 17 11:11:38 crc kubenswrapper[4742]: E0317 11:11:38.955629 4742 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.962646 4742 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.962733 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.963879 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.963948 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.963966 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.964241 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
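
The "SyncLoop ADD" with source="file" is what breaks the chicken-and-egg above: the five control-plane pods are static pods read from manifests on local disk, not from the API server, so they can start while api-int.crc.testing:6443 still refuses connections. A sketch of what the file source amounts to; the directory is the conventional staticPodPath and is an assumption about this host, not something shown in the log:

    // Illustrative only: static pods come from a manifest directory,
    // not the API server. The path is an assumed, conventional default.
    package main

    import (
    	"fmt"
    	"os"
    	"path/filepath"
    )

    func main() {
    	dir := "/etc/kubernetes/manifests" // assumed; the kubelet's staticPodPath decides
    	entries, err := os.ReadDir(dir)
    	if err != nil {
    		fmt.Fprintln(os.Stderr, err)
    		os.Exit(1)
    	}
    	for _, e := range entries {
    		ext := filepath.Ext(e.Name())
    		if !e.IsDir() && (ext == ".yaml" || ext == ".json") {
    			fmt.Println("static pod manifest:", filepath.Join(dir, e.Name()))
    		}
    	}
    }
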
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.964481 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.965838 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.965879 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.965889 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.966581 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.966613 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.966624 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.966753 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.966871 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.966914 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.968023 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.968044 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.968051 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.968137 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.968172 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.968221 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.968484 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.968511 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.968492 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.969432 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.969458 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.969467 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.970188 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.970210 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.970218 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.970361 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.970480 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.970540 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.971089 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.971108 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.971117 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.971260 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.971275 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.971283 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.971291 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.971299 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.971980 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.971997 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:38 crc kubenswrapper[4742]: I0317 11:11:38.972008 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.019883 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.020036 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.020074 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.020106 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.020161 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.020196 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.020232 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.020296 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.020330 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.020347 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.020394 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.020427 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.020483 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.020557 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.020589 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.121591 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.121716 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.121792 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.121856 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.121891 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.121967 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122001 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122068 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122135 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122183 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122237 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122265 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.121755 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122296 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122286 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.121724 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122318 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122313 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122177 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122414 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122467 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122450 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122595 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122534 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122712 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122899 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122859 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122900 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.122972 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.156118 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.157609 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.157664 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.157683 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.157722 4742 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 11:11:39 crc kubenswrapper[4742]: E0317 11:11:39.158284 4742 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc" Mar 17 11:11:39 crc kubenswrapper[4742]: E0317 11:11:39.194528 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="800ms" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.292033 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.297732 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.313626 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.339792 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.345981 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:11:39 crc kubenswrapper[4742]: W0317 11:11:39.375576 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 17 11:11:39 crc kubenswrapper[4742]: E0317 11:11:39.375695 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 17 11:11:39 crc kubenswrapper[4742]: W0317 11:11:39.457220 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-aa3e6fc52266e208e2378c60a7f39053dec4ec70571a3821ef4771240e8e9c80 WatchSource:0}: Error finding container aa3e6fc52266e208e2378c60a7f39053dec4ec70571a3821ef4771240e8e9c80: Status 404 returned error can't find the container with id aa3e6fc52266e208e2378c60a7f39053dec4ec70571a3821ef4771240e8e9c80 Mar 17 11:11:39 crc kubenswrapper[4742]: W0317 11:11:39.458372 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-2b3c1a5a4c60ec43bddea0226745577eb777ac3b2d2755eea17067555768df51 WatchSource:0}: Error finding container 2b3c1a5a4c60ec43bddea0226745577eb777ac3b2d2755eea17067555768df51: Status 404 returned error can't find the container with id 2b3c1a5a4c60ec43bddea0226745577eb777ac3b2d2755eea17067555768df51 Mar 17 11:11:39 crc kubenswrapper[4742]: W0317 11:11:39.459624 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-9adba2b7cfbce7eb87cf5e1c0ee6e8ae11a92cea863dba276ab22005b684778e WatchSource:0}: Error finding container 9adba2b7cfbce7eb87cf5e1c0ee6e8ae11a92cea863dba276ab22005b684778e: Status 404 returned error can't find the container with id 9adba2b7cfbce7eb87cf5e1c0ee6e8ae11a92cea863dba276ab22005b684778e Mar 17 11:11:39 crc kubenswrapper[4742]: W0317 11:11:39.462402 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-834a97f4b6b46d3d53826203d455be599e81fbd9d6a1c4029e89b404c13d61e7 WatchSource:0}: Error finding container 834a97f4b6b46d3d53826203d455be599e81fbd9d6a1c4029e89b404c13d61e7: Status 404 returned error can't find the container with id 834a97f4b6b46d3d53826203d455be599e81fbd9d6a1c4029e89b404c13d61e7 Mar 17 11:11:39 crc kubenswrapper[4742]: W0317 11:11:39.463705 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-3d7a16d61abcc5fbec747ba263f6378d147e29ccc646823df8888fa55fea8dd9 WatchSource:0}: Error finding container 3d7a16d61abcc5fbec747ba263f6378d147e29ccc646823df8888fa55fea8dd9: Status 404 returned error can't find the container with id 3d7a16d61abcc5fbec747ba263f6378d147e29ccc646823df8888fa55fea8dd9 Mar 17 11:11:39 crc kubenswrapper[4742]: W0317 11:11:39.558398 4742 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 17 11:11:39 crc kubenswrapper[4742]: E0317 11:11:39.558515 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.558565 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.560421 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.560458 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.560470 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.560496 4742 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 11:11:39 crc kubenswrapper[4742]: E0317 11:11:39.561194 4742 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc" Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.574054 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.667341 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9adba2b7cfbce7eb87cf5e1c0ee6e8ae11a92cea863dba276ab22005b684778e"} Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.668898 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2b3c1a5a4c60ec43bddea0226745577eb777ac3b2d2755eea17067555768df51"} Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.670559 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"aa3e6fc52266e208e2378c60a7f39053dec4ec70571a3821ef4771240e8e9c80"} Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.671720 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3d7a16d61abcc5fbec747ba263f6378d147e29ccc646823df8888fa55fea8dd9"} Mar 17 11:11:39 crc kubenswrapper[4742]: I0317 11:11:39.672830 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"834a97f4b6b46d3d53826203d455be599e81fbd9d6a1c4029e89b404c13d61e7"} Mar 17 11:11:39 crc kubenswrapper[4742]: W0317 11:11:39.841758 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 17 11:11:39 crc kubenswrapper[4742]: E0317 11:11:39.841842 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 17 11:11:39 crc kubenswrapper[4742]: E0317 11:11:39.996147 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="1.6s" Mar 17 11:11:40 crc kubenswrapper[4742]: W0317 11:11:40.135203 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 17 11:11:40 crc kubenswrapper[4742]: E0317 11:11:40.135279 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 17 11:11:40 crc kubenswrapper[4742]: I0317 11:11:40.266679 4742 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 17 11:11:40 crc kubenswrapper[4742]: E0317 11:11:40.267680 4742 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 17 11:11:40 crc kubenswrapper[4742]: I0317 11:11:40.361867 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:40 crc kubenswrapper[4742]: I0317 11:11:40.363240 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:40 crc kubenswrapper[4742]: I0317 11:11:40.363292 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:40 crc kubenswrapper[4742]: I0317 11:11:40.363310 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:40 crc kubenswrapper[4742]: I0317 11:11:40.363343 4742 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 11:11:40 crc kubenswrapper[4742]: E0317 11:11:40.363802 4742 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc" Mar 17 11:11:40 crc kubenswrapper[4742]: I0317 11:11:40.573352 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.573625 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 17 11:11:41 crc kubenswrapper[4742]: E0317 11:11:41.597722 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="3.2s" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.677755 4742 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9" exitCode=0 Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.677825 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9"} Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.677961 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.679544 4742 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d" exitCode=0 Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.679573 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d"} Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.679626 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.680228 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.680263 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.680275 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.680656 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.680693 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.680713 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.683280 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cb9086f75851d2392fa76a578b475d57eef4270c45babea46075a09f0dbef154"} Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.683313 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c95f37054e36beb567082e022834ff266550a43e6a912dc8a13ff56c92ff83dc"} Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.683328 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cae7df3f4ea292aace885f0fa3f3c6cdd8b702a84e22dccbb5cc9cd966d07764"} Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.683344 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f30c24d97c9524fad5a195f249e664ea02183bdf272a5cf4c18ca8ca92847249"} Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.683366 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.684594 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.684628 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.684640 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.685528 4742 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e" exitCode=0 Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.685601 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e"} Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.685648 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.686518 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.686615 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.686645 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.687429 4742 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8" exitCode=0 Mar 17 
11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.687461 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8"} Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.687621 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.688718 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.688745 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.688757 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.689463 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.690573 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.690641 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.690764 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:41 crc kubenswrapper[4742]: W0317 11:11:41.854239 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 17 11:11:41 crc kubenswrapper[4742]: E0317 11:11:41.854331 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.926127 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.964803 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.965988 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.966043 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.966063 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:41 crc kubenswrapper[4742]: I0317 11:11:41.966109 4742 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 11:11:41 crc kubenswrapper[4742]: E0317 11:11:41.966813 4742 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.230:6443: connect: connection refused" node="crc" Mar 17 11:11:42 crc kubenswrapper[4742]: W0317 11:11:42.211542 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 17 11:11:42 crc kubenswrapper[4742]: E0317 11:11:42.211684 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 17 11:11:42 crc kubenswrapper[4742]: W0317 11:11:42.267101 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 17 11:11:42 crc kubenswrapper[4742]: E0317 11:11:42.267179 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.573297 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.694200 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae"} Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.694259 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873"} Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.694277 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe"} Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.694313 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd"} Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.696805 4742 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5" exitCode=0 Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.696875 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5"} Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.697057 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.697886 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.697929 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.697940 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.704356 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1979945fade0ed959d214aacf4dca66954ed81718bbcebea222648ec5d32d5d3"} Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.704388 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"18a72eb81a971abc3f010dee5c6b08f3e4489f2b2a736565a539686a8c595f9a"} Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.704407 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b7a132551d101e2b563c4c67711d9016aa93f490c249da6528d1c0699559bda6"} Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.704455 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.705290 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.705310 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.705322 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.707268 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.707330 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.707246 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ef34e2c73260f5fc46fc0a526e4c1e5bd59861295b227901413b64b6d27a8a74"} Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.708372 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.708406 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:42 crc 
kubenswrapper[4742]: I0317 11:11:42.708416 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.709201 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.709223 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:42 crc kubenswrapper[4742]: I0317 11:11:42.709230 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:42 crc kubenswrapper[4742]: W0317 11:11:42.711156 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 17 11:11:42 crc kubenswrapper[4742]: E0317 11:11:42.711221 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.230:6443: connect: connection refused" logger="UnhandledError" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.343837 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 11:11:43 crc kubenswrapper[4742]: E0317 11:11:43.551194 4742 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.230:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189d9c7f3e2fbd13 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.571939091 +0000 UTC m=+1.698066849,LastTimestamp:2026-03-17 11:11:38.571939091 +0000 UTC m=+1.698066849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.573187 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.230:6443: connect: connection refused Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.713226 4742 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0" exitCode=0 Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.713313 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0"} Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.713398 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.714732 4742 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.714768 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.714780 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.721014 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.721092 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.721133 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.721292 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.722002 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.722025 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.722035 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.722194 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e101c44cdd158b9a606ca75ac9de38a553b437cd93329144349b5567bd5c0558"} Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.722276 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.726269 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.726293 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.726311 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.726352 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.726385 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.726402 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.726417 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:43 crc kubenswrapper[4742]: I0317 11:11:43.726460 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:43 crc kubenswrapper[4742]: 
I0317 11:11:43.726480 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:44 crc kubenswrapper[4742]: I0317 11:11:44.451725 4742 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 17 11:11:44 crc kubenswrapper[4742]: I0317 11:11:44.731155 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47"} Mar 17 11:11:44 crc kubenswrapper[4742]: I0317 11:11:44.731220 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:44 crc kubenswrapper[4742]: I0317 11:11:44.731244 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879"} Mar 17 11:11:44 crc kubenswrapper[4742]: I0317 11:11:44.731270 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11"} Mar 17 11:11:44 crc kubenswrapper[4742]: I0317 11:11:44.731188 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:44 crc kubenswrapper[4742]: I0317 11:11:44.731183 4742 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 11:11:44 crc kubenswrapper[4742]: I0317 11:11:44.731402 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:44 crc kubenswrapper[4742]: I0317 11:11:44.735942 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:44 crc kubenswrapper[4742]: I0317 11:11:44.736013 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:44 crc kubenswrapper[4742]: I0317 11:11:44.736031 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:44 crc kubenswrapper[4742]: I0317 11:11:44.736349 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:44 crc kubenswrapper[4742]: I0317 11:11:44.736395 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:44 crc kubenswrapper[4742]: I0317 11:11:44.736399 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:44 crc kubenswrapper[4742]: I0317 11:11:44.736422 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:44 crc kubenswrapper[4742]: I0317 11:11:44.736437 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:44 crc kubenswrapper[4742]: I0317 11:11:44.736458 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:45 crc kubenswrapper[4742]: I0317 11:11:45.167397 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:45 crc 
kubenswrapper[4742]: I0317 11:11:45.169726 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:45 crc kubenswrapper[4742]: I0317 11:11:45.169806 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:45 crc kubenswrapper[4742]: I0317 11:11:45.169820 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:45 crc kubenswrapper[4742]: I0317 11:11:45.169862 4742 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 17 11:11:45 crc kubenswrapper[4742]: I0317 11:11:45.737289 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00"} Mar 17 11:11:45 crc kubenswrapper[4742]: I0317 11:11:45.737344 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d"} Mar 17 11:11:45 crc kubenswrapper[4742]: I0317 11:11:45.737381 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:45 crc kubenswrapper[4742]: I0317 11:11:45.738202 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:45 crc kubenswrapper[4742]: I0317 11:11:45.738225 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:45 crc kubenswrapper[4742]: I0317 11:11:45.738233 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:46 crc kubenswrapper[4742]: I0317 11:11:46.343753 4742 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 11:11:46 crc kubenswrapper[4742]: I0317 11:11:46.343858 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 11:11:46 crc kubenswrapper[4742]: I0317 11:11:46.501466 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:11:46 crc kubenswrapper[4742]: I0317 11:11:46.501644 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:46 crc kubenswrapper[4742]: I0317 11:11:46.502822 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:46 crc kubenswrapper[4742]: I0317 11:11:46.502861 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:46 crc kubenswrapper[4742]: I0317 11:11:46.502874 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 17 11:11:46 crc kubenswrapper[4742]: I0317 11:11:46.627327 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 17 11:11:46 crc kubenswrapper[4742]: I0317 11:11:46.739592 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:46 crc kubenswrapper[4742]: I0317 11:11:46.740507 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:46 crc kubenswrapper[4742]: I0317 11:11:46.740531 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:46 crc kubenswrapper[4742]: I0317 11:11:46.740540 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.114840 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.115053 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.116925 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.116972 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.116983 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.266979 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.267236 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.268596 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.268638 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.268649 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.487453 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.743809 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.743810 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.745628 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.745679 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.745697 4742 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.746246 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.746289 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.746300 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.930700 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.930871 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.932574 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.932691 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.932755 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:11:47 crc kubenswrapper[4742]: I0317 11:11:47.937414 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 17 11:11:48 crc kubenswrapper[4742]: I0317 11:11:48.747028 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:11:48 crc kubenswrapper[4742]: I0317 11:11:48.748709 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:11:48 crc kubenswrapper[4742]: I0317 11:11:48.748780 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:11:48 crc kubenswrapper[4742]: I0317 11:11:48.748799 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:11:48 crc kubenswrapper[4742]: E0317 11:11:48.858575 4742 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 17 11:11:52 crc kubenswrapper[4742]: I0317 11:11:52.916482 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 17 11:11:52 crc kubenswrapper[4742]: I0317 11:11:52.916685 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:11:52 crc kubenswrapper[4742]: I0317 11:11:52.918190 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:11:52 crc kubenswrapper[4742]: I0317 11:11:52.918257 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:11:52 crc kubenswrapper[4742]: I0317 11:11:52.918270 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:11:54 crc kubenswrapper[4742]: E0317 11:11:54.454448 4742 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 17 11:11:54 crc kubenswrapper[4742]: I0317 11:11:54.574120 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Mar 17 11:11:54 crc kubenswrapper[4742]: I0317 11:11:54.762565 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 17 11:11:54 crc kubenswrapper[4742]: I0317 11:11:54.764714 4742 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e101c44cdd158b9a606ca75ac9de38a553b437cd93329144349b5567bd5c0558" exitCode=255
Mar 17 11:11:54 crc kubenswrapper[4742]: I0317 11:11:54.764753 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e101c44cdd158b9a606ca75ac9de38a553b437cd93329144349b5567bd5c0558"}
Mar 17 11:11:54 crc kubenswrapper[4742]: I0317 11:11:54.764869 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:11:54 crc kubenswrapper[4742]: I0317 11:11:54.765608 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:11:54 crc kubenswrapper[4742]: I0317 11:11:54.765641 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:11:54 crc kubenswrapper[4742]: I0317 11:11:54.765649 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:11:54 crc kubenswrapper[4742]: I0317 11:11:54.766122 4742 scope.go:117] "RemoveContainer" containerID="e101c44cdd158b9a606ca75ac9de38a553b437cd93329144349b5567bd5c0558"
Mar 17 11:11:54 crc kubenswrapper[4742]: E0317 11:11:54.799976 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="6.4s"
Mar 17 11:11:55 crc kubenswrapper[4742]: E0317 11:11:55.171712 4742 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc"
Mar 17 11:11:55 crc kubenswrapper[4742]: W0317 11:11:55.346577 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 17 11:11:55 crc kubenswrapper[4742]: I0317 11:11:55.346667 4742 trace.go:236] Trace[1435405565]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Mar-2026 11:11:45.345) (total time: 10001ms):
Mar 17 11:11:55 crc kubenswrapper[4742]: Trace[1435405565]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:11:55.346)
Mar 17 11:11:55 crc kubenswrapper[4742]: Trace[1435405565]: [10.001488895s] [10.001488895s] END
Mar 17 11:11:55 crc kubenswrapper[4742]: E0317 11:11:55.346691 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 17 11:11:55 crc kubenswrapper[4742]: W0317 11:11:55.558890 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:11:55Z is after 2026-02-23T05:33:13Z
Mar 17 11:11:55 crc kubenswrapper[4742]: E0317 11:11:55.559639 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:11:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 17 11:11:55 crc kubenswrapper[4742]: W0317 11:11:55.561448 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:11:55Z is after 2026-02-23T05:33:13Z
Mar 17 11:11:55 crc kubenswrapper[4742]: E0317 11:11:55.561618 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:11:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 17 11:11:55 crc kubenswrapper[4742]: E0317 11:11:55.561321 4742 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:11:55Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189d9c7f3e2fbd13 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.571939091 +0000 UTC m=+1.698066849,LastTimestamp:2026-03-17 11:11:38.571939091 +0000 UTC m=+1.698066849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:11:55 crc kubenswrapper[4742]: W0317 11:11:55.563337 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:11:55Z is after 2026-02-23T05:33:13Z
Mar 17 11:11:55 crc kubenswrapper[4742]: E0317 11:11:55.563427 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:11:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 17 11:11:55 crc kubenswrapper[4742]: I0317 11:11:55.563662 4742 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 17 11:11:55 crc kubenswrapper[4742]: I0317 11:11:55.563811 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 17 11:11:55 crc kubenswrapper[4742]: I0317 11:11:55.568373 4742 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 17 11:11:55 crc kubenswrapper[4742]: I0317 11:11:55.568442 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 17 11:11:55 crc kubenswrapper[4742]: I0317 11:11:55.576741 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:11:55Z is after 2026-02-23T05:33:13Z
Mar 17 11:11:55 crc kubenswrapper[4742]: I0317 11:11:55.769866 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 17 11:11:55 crc kubenswrapper[4742]: I0317 11:11:55.771584 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a73472408141359b54c99a824697f22be4aad5b9c22c0df8e77545e98176acf3"}
Mar 17 11:11:55 crc kubenswrapper[4742]: I0317 11:11:55.771801 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:11:55 crc kubenswrapper[4742]: I0317 11:11:55.773001 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:11:55 crc kubenswrapper[4742]: I0317 11:11:55.773047 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:11:55 crc kubenswrapper[4742]: I0317 11:11:55.773061 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:11:56 crc kubenswrapper[4742]: I0317 11:11:56.344763 4742 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 17 11:11:56 crc kubenswrapper[4742]: I0317 11:11:56.344840 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 17 11:11:56 crc kubenswrapper[4742]: I0317 11:11:56.502497 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:11:56 crc kubenswrapper[4742]: I0317 11:11:56.577128 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:11:56Z is after 2026-02-23T05:33:13Z
Mar 17 11:11:56 crc kubenswrapper[4742]: I0317 11:11:56.775580 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 17 11:11:56 crc kubenswrapper[4742]: I0317 11:11:56.775978 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 17 11:11:56 crc kubenswrapper[4742]: I0317 11:11:56.777479 4742 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a73472408141359b54c99a824697f22be4aad5b9c22c0df8e77545e98176acf3" exitCode=255
Mar 17 11:11:56 crc kubenswrapper[4742]: I0317 11:11:56.777521 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a73472408141359b54c99a824697f22be4aad5b9c22c0df8e77545e98176acf3"}
Mar 17 11:11:56 crc kubenswrapper[4742]: I0317 11:11:56.777570 4742 scope.go:117] "RemoveContainer" containerID="e101c44cdd158b9a606ca75ac9de38a553b437cd93329144349b5567bd5c0558"
Mar 17 11:11:56 crc kubenswrapper[4742]: I0317 11:11:56.777576 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:11:56 crc kubenswrapper[4742]: I0317 11:11:56.778377 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:11:56 crc kubenswrapper[4742]: I0317 11:11:56.778405 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:11:56 crc kubenswrapper[4742]: I0317 11:11:56.778417 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:11:56 crc kubenswrapper[4742]: I0317 11:11:56.778983 4742 scope.go:117] "RemoveContainer" containerID="a73472408141359b54c99a824697f22be4aad5b9c22c0df8e77545e98176acf3"
Mar 17 11:11:56 crc kubenswrapper[4742]: E0317 11:11:56.779174 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 17 11:11:57 crc kubenswrapper[4742]: I0317 11:11:57.119355 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 17 11:11:57 crc kubenswrapper[4742]: I0317 11:11:57.119531 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:11:57 crc kubenswrapper[4742]: I0317 11:11:57.121292 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:11:57 crc kubenswrapper[4742]: I0317 11:11:57.121330 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:11:57 crc kubenswrapper[4742]: I0317 11:11:57.121339 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:11:57 crc kubenswrapper[4742]: I0317 11:11:57.276760 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:11:57 crc kubenswrapper[4742]: I0317 11:11:57.490979 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:11:57 crc kubenswrapper[4742]: I0317 11:11:57.575578 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:11:57Z is after 2026-02-23T05:33:13Z
Mar 17 11:11:57 crc kubenswrapper[4742]: I0317 11:11:57.782389 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 17 11:11:57 crc kubenswrapper[4742]: I0317 11:11:57.784606 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:11:57 crc kubenswrapper[4742]: I0317 11:11:57.785432 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:11:57 crc kubenswrapper[4742]: I0317 11:11:57.785500 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:11:57 crc kubenswrapper[4742]: I0317 11:11:57.785515 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:11:57 crc kubenswrapper[4742]: I0317 11:11:57.786408 4742 scope.go:117] "RemoveContainer" containerID="a73472408141359b54c99a824697f22be4aad5b9c22c0df8e77545e98176acf3"
Mar 17 11:11:57 crc kubenswrapper[4742]: E0317 11:11:57.786660 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 17 11:11:58 crc kubenswrapper[4742]: I0317 11:11:58.577993 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:11:58Z is after 2026-02-23T05:33:13Z
Mar 17 11:11:58 crc kubenswrapper[4742]: I0317 11:11:58.787562 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:11:58 crc kubenswrapper[4742]: I0317 11:11:58.788886 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:11:58 crc kubenswrapper[4742]: I0317 11:11:58.788961 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:11:58 crc kubenswrapper[4742]: I0317 11:11:58.788976 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:11:58 crc kubenswrapper[4742]: I0317 11:11:58.789742 4742 scope.go:117] "RemoveContainer" containerID="a73472408141359b54c99a824697f22be4aad5b9c22c0df8e77545e98176acf3"
Mar 17 11:11:58 crc kubenswrapper[4742]: E0317 11:11:58.789943 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 17 11:11:58 crc kubenswrapper[4742]: E0317 11:11:58.858869 4742 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 17 11:11:59 crc kubenswrapper[4742]: I0317 11:11:59.577824 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:11:59Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:00 crc kubenswrapper[4742]: I0317 11:12:00.575603 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:00Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:01 crc kubenswrapper[4742]: E0317 11:12:01.205702 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:01Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 17 11:12:01 crc kubenswrapper[4742]: I0317 11:12:01.571888 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:01 crc kubenswrapper[4742]: I0317 11:12:01.573884 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:01 crc kubenswrapper[4742]: I0317 11:12:01.573991 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:01 crc kubenswrapper[4742]: I0317 11:12:01.574014 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:01 crc kubenswrapper[4742]: I0317 11:12:01.574061 4742 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 17 11:12:01 crc kubenswrapper[4742]: I0317 11:12:01.576982 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:01Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:01 crc kubenswrapper[4742]: E0317 11:12:01.579311 4742 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:01Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 17 11:12:02 crc kubenswrapper[4742]: I0317 11:12:02.577642 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:02Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:02 crc kubenswrapper[4742]: W0317 11:12:02.761388 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:02Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:02 crc kubenswrapper[4742]: E0317 11:12:02.761463 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 17 11:12:02 crc kubenswrapper[4742]: I0317 11:12:02.951985 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 17 11:12:02 crc kubenswrapper[4742]: I0317 11:12:02.952160 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:02 crc kubenswrapper[4742]: I0317 11:12:02.953630 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:02 crc kubenswrapper[4742]: I0317 11:12:02.953684 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:02 crc kubenswrapper[4742]: I0317 11:12:02.953694 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:02 crc kubenswrapper[4742]: I0317 11:12:02.967119 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 17 11:12:03 crc kubenswrapper[4742]: I0317 11:12:03.021736 4742 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 17 11:12:03 crc kubenswrapper[4742]: E0317 11:12:03.027382 4742 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 17 11:12:03 crc kubenswrapper[4742]: I0317 11:12:03.577340 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:03Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:03 crc kubenswrapper[4742]: I0317 11:12:03.798237 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:03 crc kubenswrapper[4742]: I0317 11:12:03.802115 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:03 crc kubenswrapper[4742]: I0317 11:12:03.802443 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:03 crc kubenswrapper[4742]: I0317 11:12:03.802559 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:04 crc kubenswrapper[4742]: W0317 11:12:04.149679 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:04Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:04 crc kubenswrapper[4742]: E0317 11:12:04.150071 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 17 11:12:04 crc kubenswrapper[4742]: I0317 11:12:04.576740 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:04Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:05 crc kubenswrapper[4742]: E0317 11:12:05.564888 4742 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:05Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189d9c7f3e2fbd13 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.571939091 +0000 UTC m=+1.698066849,LastTimestamp:2026-03-17 11:11:38.571939091 +0000 UTC m=+1.698066849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:12:05 crc kubenswrapper[4742]: I0317 11:12:05.577440 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:05Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:05 crc kubenswrapper[4742]: W0317 11:12:05.950431 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:05Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:05 crc kubenswrapper[4742]: E0317 11:12:05.950596 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.335982 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.336503 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.337898 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.338016 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.338044 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.339106 4742 scope.go:117] "RemoveContainer" containerID="a73472408141359b54c99a824697f22be4aad5b9c22c0df8e77545e98176acf3"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.345758 4742 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.345829 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.345934 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.346161 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.347518 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.347552 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.347568 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.348321 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"cae7df3f4ea292aace885f0fa3f3c6cdd8b702a84e22dccbb5cc9cd966d07764"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.348533 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://cae7df3f4ea292aace885f0fa3f3c6cdd8b702a84e22dccbb5cc9cd966d07764" gracePeriod=30
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.576570 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:06Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.810585 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.812843 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a06f4ae84833508b5a46cd3cf142fc9e1ca5de2f9de2bb120142f3a9d9428b7b"}
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.813009 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.813975 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.814011 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.814032 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.817309 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.817617 4742 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="cae7df3f4ea292aace885f0fa3f3c6cdd8b702a84e22dccbb5cc9cd966d07764" exitCode=255
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.817650 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"cae7df3f4ea292aace885f0fa3f3c6cdd8b702a84e22dccbb5cc9cd966d07764"}
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.817672 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223"}
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.817747 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.818505 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.818530 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:06 crc kubenswrapper[4742]: I0317 11:12:06.818541 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:07 crc kubenswrapper[4742]: I0317 11:12:07.576068 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:07Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:07 crc kubenswrapper[4742]: I0317 11:12:07.822279 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 17 11:12:07 crc kubenswrapper[4742]: I0317 11:12:07.822887 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 17 11:12:07 crc kubenswrapper[4742]: I0317 11:12:07.826070 4742 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a06f4ae84833508b5a46cd3cf142fc9e1ca5de2f9de2bb120142f3a9d9428b7b" exitCode=255
Mar 17 11:12:07 crc kubenswrapper[4742]: I0317 11:12:07.826142 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a06f4ae84833508b5a46cd3cf142fc9e1ca5de2f9de2bb120142f3a9d9428b7b"}
Mar 17 11:12:07 crc kubenswrapper[4742]: I0317 11:12:07.826238 4742 scope.go:117] "RemoveContainer" containerID="a73472408141359b54c99a824697f22be4aad5b9c22c0df8e77545e98176acf3"
Mar 17 11:12:07 crc kubenswrapper[4742]: I0317 11:12:07.826461 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:07 crc kubenswrapper[4742]: I0317 11:12:07.831241 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:07 crc kubenswrapper[4742]: I0317 11:12:07.831278 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:07 crc kubenswrapper[4742]: I0317 11:12:07.831289 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:07 crc kubenswrapper[4742]: I0317 11:12:07.832016 4742 scope.go:117] "RemoveContainer" containerID="a06f4ae84833508b5a46cd3cf142fc9e1ca5de2f9de2bb120142f3a9d9428b7b"
Mar 17 11:12:07 crc kubenswrapper[4742]: E0317 11:12:07.832254 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 17 11:12:07 crc kubenswrapper[4742]: W0317 11:12:07.933857 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:07Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:07 crc kubenswrapper[4742]: E0317 11:12:07.934016 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 17 11:12:08 crc kubenswrapper[4742]: E0317 11:12:08.208732 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:08Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 17 11:12:08 crc kubenswrapper[4742]: I0317 11:12:08.576410 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:08Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:08 crc kubenswrapper[4742]: I0317 11:12:08.579421 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:08 crc kubenswrapper[4742]: I0317 11:12:08.581325 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:08 crc kubenswrapper[4742]: I0317 11:12:08.581396 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:08 crc kubenswrapper[4742]: I0317 11:12:08.581414 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:08 crc kubenswrapper[4742]: I0317 11:12:08.581459 4742 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 17 11:12:08 crc kubenswrapper[4742]: E0317 11:12:08.584546 4742 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:08Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 17 11:12:08 crc kubenswrapper[4742]: I0317 11:12:08.830549 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 17 11:12:08 crc kubenswrapper[4742]: E0317 11:12:08.859155 4742 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 17 11:12:09 crc kubenswrapper[4742]: I0317 11:12:09.576038 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:09Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:10 crc kubenswrapper[4742]: I0317 11:12:10.576764 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:10Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:11 crc kubenswrapper[4742]: I0317 11:12:11.576125 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:11Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:11 crc kubenswrapper[4742]: I0317 11:12:11.927107 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 17 11:12:11 crc kubenswrapper[4742]: I0317 11:12:11.927301 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:11 crc kubenswrapper[4742]: I0317 11:12:11.928781 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:11 crc kubenswrapper[4742]: I0317 11:12:11.928855 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:11 crc kubenswrapper[4742]: I0317 11:12:11.928877 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:12 crc kubenswrapper[4742]: I0317 11:12:12.579031 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:12Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:13 crc kubenswrapper[4742]: I0317 11:12:13.344411 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 17 11:12:13 crc kubenswrapper[4742]: I0317 11:12:13.345376 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:13 crc kubenswrapper[4742]: I0317 11:12:13.346901 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:13 crc kubenswrapper[4742]: I0317 11:12:13.346954 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:13 crc kubenswrapper[4742]: I0317 11:12:13.346967 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:13 crc kubenswrapper[4742]: I0317 11:12:13.578481 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:13Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:14 crc kubenswrapper[4742]: I0317 11:12:14.576618 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:14Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:15 crc kubenswrapper[4742]: E0317 11:12:15.215069 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:15Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 17 11:12:15 crc kubenswrapper[4742]: E0317 11:12:15.569440 4742 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:15Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189d9c7f3e2fbd13 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.571939091 +0000 UTC m=+1.698066849,LastTimestamp:2026-03-17 11:11:38.571939091 +0000 UTC m=+1.698066849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:12:15 crc kubenswrapper[4742]: I0317 11:12:15.575828 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:15Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:15 crc kubenswrapper[4742]: I0317 11:12:15.585068 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:15 crc kubenswrapper[4742]: I0317 11:12:15.586629 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:15 crc kubenswrapper[4742]: I0317 11:12:15.586732 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:15 crc kubenswrapper[4742]: I0317 11:12:15.586748 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:15 crc kubenswrapper[4742]: I0317 11:12:15.586786 4742 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 17 11:12:15 crc kubenswrapper[4742]: E0317 11:12:15.590186 4742 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:15Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 17 11:12:16 crc kubenswrapper[4742]: I0317 11:12:16.336191 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:12:16 crc kubenswrapper[4742]: I0317 11:12:16.336477 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:16 crc kubenswrapper[4742]: I0317 11:12:16.338185 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:16 crc kubenswrapper[4742]: I0317 11:12:16.338297 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:16 crc kubenswrapper[4742]: I0317 11:12:16.338332 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:16 crc kubenswrapper[4742]: I0317 11:12:16.339331 4742 scope.go:117] "RemoveContainer" containerID="a06f4ae84833508b5a46cd3cf142fc9e1ca5de2f9de2bb120142f3a9d9428b7b"
Mar 17 11:12:16 crc kubenswrapper[4742]: E0317 11:12:16.339637 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 17 11:12:16 crc kubenswrapper[4742]: I0317 11:12:16.345018 4742 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 17 11:12:16 crc kubenswrapper[4742]: I0317 11:12:16.345128 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 17 11:12:16 crc kubenswrapper[4742]: I0317 11:12:16.502610 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:12:16 crc kubenswrapper[4742]: I0317 11:12:16.577828 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:16Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:16 crc kubenswrapper[4742]: I0317 11:12:16.866252 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:16 crc kubenswrapper[4742]: I0317 11:12:16.867606 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:16 crc kubenswrapper[4742]: I0317 11:12:16.867670 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:16 crc kubenswrapper[4742]: I0317 11:12:16.867690 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:16 crc kubenswrapper[4742]: I0317 11:12:16.868639 4742 scope.go:117] "RemoveContainer" containerID="a06f4ae84833508b5a46cd3cf142fc9e1ca5de2f9de2bb120142f3a9d9428b7b"
Mar 17 11:12:16 crc kubenswrapper[4742]: E0317 11:12:16.869002 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 17 11:12:17 crc kubenswrapper[4742]: I0317 11:12:17.577065 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:17Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:18 crc kubenswrapper[4742]: I0317 11:12:18.575964 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:18Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:18 crc kubenswrapper[4742]: E0317 11:12:18.859339 4742 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 17 11:12:19 crc kubenswrapper[4742]: I0317 11:12:19.576974 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:19Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:19 crc kubenswrapper[4742]: I0317 11:12:19.950339 4742 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 17 11:12:19 crc kubenswrapper[4742]: E0317 11:12:19.956116 4742 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 17 11:12:19 crc kubenswrapper[4742]: E0317 11:12:19.957415 4742 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError"
Mar 17 11:12:20 crc kubenswrapper[4742]: I0317 11:12:20.577489 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:20Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:21 crc kubenswrapper[4742]: I0317 11:12:21.577586 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:21Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:22 crc kubenswrapper[4742]: E0317 11:12:22.219051 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:22Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 17 11:12:22 crc kubenswrapper[4742]: I0317 11:12:22.578851 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:22Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:22 crc kubenswrapper[4742]: I0317 11:12:22.590727 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:22 crc kubenswrapper[4742]: I0317 11:12:22.592237 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:22 crc kubenswrapper[4742]: I0317 11:12:22.592284 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:22 crc kubenswrapper[4742]: I0317 11:12:22.592294 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:22 crc kubenswrapper[4742]: I0317 11:12:22.592325 4742 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 17 11:12:22 crc kubenswrapper[4742]: E0317 11:12:22.597634 4742 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:22Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 17 11:12:23 crc kubenswrapper[4742]: I0317 11:12:23.577995 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:23Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:23 crc kubenswrapper[4742]: W0317 11:12:23.649168 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:23Z is after 2026-02-23T05:33:13Z
Mar 17 11:12:23 crc kubenswrapper[4742]: E0317 11:12:23.650207 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:12:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 17 11:12:24 crc kubenswrapper[4742]: I0317 11:12:24.577364 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:25 crc kubenswrapper[4742]: W0317 11:12:25.051435 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.051586 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 17 11:12:25 crc kubenswrapper[4742]: I0317 11:12:25.578603 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.578569 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f3e2fbd13 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.571939091 +0000 UTC m=+1.698066849,LastTimestamp:2026-03-17 11:11:38.571939091 +0000 UTC m=+1.698066849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.586184 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f420062ba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635944634 +0000 UTC m=+1.762072392,LastTimestamp:2026-03-17 11:11:38.635944634 +0000 UTC m=+1.762072392,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.591713 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f4200c214 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635969044 +0000 UTC m=+1.762096802,LastTimestamp:2026-03-17 11:11:38.635969044 +0000 UTC m=+1.762096802,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.596546 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f4200e1d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635977174 +0000 UTC m=+1.762104932,LastTimestamp:2026-03-17 11:11:38.635977174 +0000 UTC m=+1.762104932,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.602815 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f4f4cac3b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.859047995 +0000 UTC m=+1.985175753,LastTimestamp:2026-03-17 11:11:38.859047995 +0000 UTC m=+1.985175753,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.607741 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f420062ba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f420062ba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635944634 +0000 UTC m=+1.762072392,LastTimestamp:2026-03-17 11:11:38.954891903 +0000 UTC m=+2.081019661,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.611852 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f4200c214\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f4200c214 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635969044 +0000 UTC m=+1.762096802,LastTimestamp:2026-03-17 11:11:38.954935933 +0000 UTC m=+2.081063691,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.615741 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f4200e1d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f4200e1d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635977174 +0000 UTC m=+1.762104932,LastTimestamp:2026-03-17 11:11:38.954948244 +0000 UTC m=+2.081076002,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.619633 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f420062ba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f420062ba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635944634 +0000 UTC m=+1.762072392,LastTimestamp:2026-03-17 11:11:38.963902492 +0000 UTC m=+2.090030290,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.625989 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f4200c214\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f4200c214 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635969044 +0000 UTC m=+1.762096802,LastTimestamp:2026-03-17 11:11:38.963958853 +0000 UTC m=+2.090086651,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.630646 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f4200e1d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f4200e1d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635977174 +0000 UTC m=+1.762104932,LastTimestamp:2026-03-17 11:11:38.963975913 +0000 UTC m=+2.090103711,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.635859 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f420062ba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f420062ba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635944634 +0000 UTC 
m=+1.762072392,LastTimestamp:2026-03-17 11:11:38.965865112 +0000 UTC m=+2.091992870,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.640259 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f4200c214\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f4200c214 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635969044 +0000 UTC m=+1.762096802,LastTimestamp:2026-03-17 11:11:38.965885742 +0000 UTC m=+2.092013500,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.644362 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f4200e1d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f4200e1d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635977174 +0000 UTC m=+1.762104932,LastTimestamp:2026-03-17 11:11:38.965893813 +0000 UTC m=+2.092021571,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.648800 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f420062ba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f420062ba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635944634 +0000 UTC m=+1.762072392,LastTimestamp:2026-03-17 11:11:38.966598794 +0000 UTC m=+2.092726552,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.654642 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f4200c214\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f4200c214 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635969044 +0000 UTC m=+1.762096802,LastTimestamp:2026-03-17 11:11:38.966619134 +0000 UTC m=+2.092746892,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.659047 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f4200e1d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f4200e1d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635977174 +0000 UTC m=+1.762104932,LastTimestamp:2026-03-17 11:11:38.966629464 +0000 UTC m=+2.092757222,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.664372 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f420062ba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f420062ba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635944634 +0000 UTC m=+1.762072392,LastTimestamp:2026-03-17 11:11:38.968038376 +0000 UTC m=+2.094166134,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.666035 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f4200c214\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f4200c214 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635969044 +0000 UTC m=+1.762096802,LastTimestamp:2026-03-17 11:11:38.968048356 +0000 UTC m=+2.094176114,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.672532 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f4200e1d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f4200e1d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635977174 +0000 UTC m=+1.762104932,LastTimestamp:2026-03-17 11:11:38.968056096 +0000 UTC m=+2.094183854,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.677152 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f420062ba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f420062ba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635944634 +0000 UTC m=+1.762072392,LastTimestamp:2026-03-17 11:11:38.968160188 +0000 UTC m=+2.094287976,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.681317 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f4200c214\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f4200c214 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635969044 +0000 UTC m=+1.762096802,LastTimestamp:2026-03-17 11:11:38.968182668 +0000 UTC m=+2.094310466,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.688019 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f4200e1d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f4200e1d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635977174 +0000 UTC m=+1.762104932,LastTimestamp:2026-03-17 11:11:38.968230849 +0000 UTC m=+2.094358637,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.692776 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f420062ba\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{crc.189d9c7f420062ba default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635944634 +0000 UTC m=+1.762072392,LastTimestamp:2026-03-17 11:11:38.969446618 +0000 UTC m=+2.095574376,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.697003 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d9c7f4200c214\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d9c7f4200c214 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:38.635969044 +0000 UTC m=+1.762096802,LastTimestamp:2026-03-17 11:11:38.969463638 +0000 UTC m=+2.095591396,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.704789 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c7f73d11042 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:39.47170413 +0000 UTC m=+2.597831888,LastTimestamp:2026-03-17 11:11:39.47170413 +0000 UTC m=+2.597831888,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.709177 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c7f73d148a0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 
11:11:39.47171856 +0000 UTC m=+2.597846358,LastTimestamp:2026-03-17 11:11:39.47171856 +0000 UTC m=+2.597846358,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.714492 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c7f73d9c2c7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:39.472274119 +0000 UTC m=+2.598401877,LastTimestamp:2026-03-17 11:11:39.472274119 +0000 UTC m=+2.598401877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.719423 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d9c7f73d9c60f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:39.472274959 +0000 UTC m=+2.598402747,LastTimestamp:2026-03-17 11:11:39.472274959 +0000 UTC m=+2.598402747,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.724066 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d9c7f74ef8c61 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 
11:11:39.490479201 +0000 UTC m=+2.616606989,LastTimestamp:2026-03-17 11:11:39.490479201 +0000 UTC m=+2.616606989,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.729857 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c7fbbc4447b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:40.678825083 +0000 UTC m=+3.804952881,LastTimestamp:2026-03-17 11:11:40.678825083 +0000 UTC m=+3.804952881,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.734289 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d9c7fbbc508e2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:40.678875362 +0000 UTC m=+3.805003130,LastTimestamp:2026-03-17 11:11:40.678875362 +0000 UTC m=+3.805003130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.738418 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d9c7fbbc530d7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:40.678885591 +0000 UTC m=+3.805013359,LastTimestamp:2026-03-17 11:11:40.678885591 +0000 UTC m=+3.805013359,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.742791 4742 event.go:359] "Server rejected event (will not retry!)" err="events 
is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c7fbbc6d101 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:40.678992129 +0000 UTC m=+3.805119887,LastTimestamp:2026-03-17 11:11:40.678992129 +0000 UTC m=+3.805119887,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.746548 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c7fbbc8acc6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:40.679113926 +0000 UTC m=+3.805241704,LastTimestamp:2026-03-17 11:11:40.679113926 +0000 UTC m=+3.805241704,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.751626 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d9c7fbc7faecd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:40.691107533 +0000 UTC m=+3.817235291,LastTimestamp:2026-03-17 11:11:40.691107533 +0000 UTC m=+3.817235291,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.755855 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c7fbcdc134f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:40.697162575 +0000 UTC m=+3.823290333,LastTimestamp:2026-03-17 11:11:40.697162575 +0000 UTC m=+3.823290333,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.759714 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c7fbd0eec7d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:40.700494973 +0000 UTC m=+3.826622731,LastTimestamp:2026-03-17 11:11:40.700494973 +0000 UTC m=+3.826622731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.763199 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c7fbd109228 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:40.70060292 +0000 UTC m=+3.826730678,LastTimestamp:2026-03-17 11:11:40.70060292 +0000 UTC m=+3.826730678,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.766965 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d9c7fbd163932 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:40.700973362 +0000 UTC m=+3.827101160,LastTimestamp:2026-03-17 11:11:40.700973362 +0000 UTC m=+3.827101160,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.768598 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c7fbd28dfce openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:40.702195662 +0000 UTC m=+3.828323460,LastTimestamp:2026-03-17 11:11:40.702195662 +0000 UTC m=+3.828323460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.771186 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c7fd0a562f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.029122807 +0000 UTC m=+4.155250605,LastTimestamp:2026-03-17 11:11:41.029122807 +0000 UTC m=+4.155250605,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.775189 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c7fd1bf6dc5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.047606725 +0000 UTC m=+4.173734523,LastTimestamp:2026-03-17 11:11:41.047606725 +0000 UTC m=+4.173734523,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.779269 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c7fd1d2b302 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.048869634 +0000 UTC m=+4.174997432,LastTimestamp:2026-03-17 11:11:41.048869634 +0000 UTC m=+4.174997432,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.783365 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c7fde5d5735 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.259282229 +0000 UTC m=+4.385409987,LastTimestamp:2026-03-17 11:11:41.259282229 +0000 UTC m=+4.385409987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.787004 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c7fe091a4cb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.296264395 +0000 UTC m=+4.422392153,LastTimestamp:2026-03-17 11:11:41.296264395 +0000 UTC m=+4.422392153,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.791731 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c7fe0a68de7 openshift-kube-controller-manager 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.297634791 +0000 UTC m=+4.423762589,LastTimestamp:2026-03-17 11:11:41.297634791 +0000 UTC m=+4.423762589,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.799856 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c7feb5e18a2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.477435554 +0000 UTC m=+4.603563322,LastTimestamp:2026-03-17 11:11:41.477435554 +0000 UTC m=+4.603563322,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.804226 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c7fed1e10fe openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.506793726 +0000 UTC m=+4.632921484,LastTimestamp:2026-03-17 11:11:41.506793726 +0000 UTC m=+4.632921484,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.809681 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d9c7ff7884c10 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.681527824 +0000 UTC m=+4.807655592,LastTimestamp:2026-03-17 11:11:41.681527824 +0000 UTC m=+4.807655592,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.815098 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d9c7ff79431f7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.682307575 +0000 UTC m=+4.808435333,LastTimestamp:2026-03-17 11:11:41.682307575 +0000 UTC m=+4.808435333,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.820967 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c7ff7fe85bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.689275835 +0000 UTC m=+4.815403613,LastTimestamp:2026-03-17 11:11:41.689275835 +0000 UTC m=+4.815403613,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.825950 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c7ff80c1548 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.690164552 +0000 UTC m=+4.816292330,LastTimestamp:2026-03-17 11:11:41.690164552 +0000 UTC m=+4.816292330,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.830280 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d9c8004a75627 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.901665831 +0000 UTC m=+5.027793589,LastTimestamp:2026-03-17 11:11:41.901665831 +0000 UTC m=+5.027793589,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.834975 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d9c8004a9ee14 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.901835796 +0000 UTC m=+5.027963574,LastTimestamp:2026-03-17 11:11:41.901835796 +0000 UTC m=+5.027963574,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.840211 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c8004ac1673 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container 
kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.901977203 +0000 UTC m=+5.028104961,LastTimestamp:2026-03-17 11:11:41.901977203 +0000 UTC m=+5.028104961,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.844835 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c8004bb5f8e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.902978958 +0000 UTC m=+5.029106716,LastTimestamp:2026-03-17 11:11:41.902978958 +0000 UTC m=+5.029106716,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.854711 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d9c80056abf4b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.914472267 +0000 UTC m=+5.040600025,LastTimestamp:2026-03-17 11:11:41.914472267 +0000 UTC m=+5.040600025,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.860875 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d9c80057c5fa5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.915627429 +0000 UTC m=+5.041755187,LastTimestamp:2026-03-17 11:11:41.915627429 +0000 UTC m=+5.041755187,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 
crc kubenswrapper[4742]: E0317 11:12:25.866038 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c8006a88c86 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.935299718 +0000 UTC m=+5.061427476,LastTimestamp:2026-03-17 11:11:41.935299718 +0000 UTC m=+5.061427476,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.872881 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c8006ea5df5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.939613173 +0000 UTC m=+5.065740931,LastTimestamp:2026-03-17 11:11:41.939613173 +0000 UTC m=+5.065740931,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.876867 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c80082b9ecc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.960666828 +0000 UTC m=+5.086794596,LastTimestamp:2026-03-17 11:11:41.960666828 +0000 UTC m=+5.086794596,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.878854 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d9c8008a76c0b openshift-machine-config-operator 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.968780299 +0000 UTC m=+5.094908057,LastTimestamp:2026-03-17 11:11:41.968780299 +0000 UTC m=+5.094908057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.883988 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d9c8013d5c4d7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.156367063 +0000 UTC m=+5.282494811,LastTimestamp:2026-03-17 11:11:42.156367063 +0000 UTC m=+5.282494811,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.887736 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c80145ce0d3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.165221587 +0000 UTC m=+5.291349345,LastTimestamp:2026-03-17 11:11:42.165221587 +0000 UTC m=+5.291349345,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.891550 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d9c801592bd18 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container 
kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.1855286 +0000 UTC m=+5.311656358,LastTimestamp:2026-03-17 11:11:42.1855286 +0000 UTC m=+5.311656358,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.894932 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d9c8015a0505c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.186418268 +0000 UTC m=+5.312546026,LastTimestamp:2026-03-17 11:11:42.186418268 +0000 UTC m=+5.312546026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.898353 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c8015f4968f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.191941263 +0000 UTC m=+5.318069021,LastTimestamp:2026-03-17 11:11:42.191941263 +0000 UTC m=+5.318069021,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.902237 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c80160947c6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.19329735 +0000 UTC 
m=+5.319425108,LastTimestamp:2026-03-17 11:11:42.19329735 +0000 UTC m=+5.319425108,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.905548 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c8023d6a639 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.424860217 +0000 UTC m=+5.550987975,LastTimestamp:2026-03-17 11:11:42.424860217 +0000 UTC m=+5.550987975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.909516 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d9c8024079952 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.428068178 +0000 UTC m=+5.554195936,LastTimestamp:2026-03-17 11:11:42.428068178 +0000 UTC m=+5.554195936,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.913183 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c8024e49061 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.442549345 +0000 UTC m=+5.568677103,LastTimestamp:2026-03-17 11:11:42.442549345 +0000 UTC m=+5.568677103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.918993 4742 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c8024ffb707 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.444328711 +0000 UTC m=+5.570456469,LastTimestamp:2026-03-17 11:11:42.444328711 +0000 UTC m=+5.570456469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.924555 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d9c8025d09dff openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.458019327 +0000 UTC m=+5.584147085,LastTimestamp:2026-03-17 11:11:42.458019327 +0000 UTC m=+5.584147085,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.930233 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c802f867b31 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.620932913 +0000 UTC m=+5.747060671,LastTimestamp:2026-03-17 11:11:42.620932913 +0000 UTC m=+5.747060671,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.934311 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c803074e750 openshift-kube-apiserver 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.63655816 +0000 UTC m=+5.762685918,LastTimestamp:2026-03-17 11:11:42.63655816 +0000 UTC m=+5.762685918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.938278 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c80308b730c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.638035724 +0000 UTC m=+5.764163482,LastTimestamp:2026-03-17 11:11:42.638035724 +0000 UTC m=+5.764163482,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.943340 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c80343e697c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.700095868 +0000 UTC m=+5.826223646,LastTimestamp:2026-03-17 11:11:42.700095868 +0000 UTC m=+5.826223646,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.951127 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c8040b7bd4a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.90937377 +0000 UTC m=+6.035501528,LastTimestamp:2026-03-17 11:11:42.90937377 +0000 UTC m=+6.035501528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.956090 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c8040f6c1dd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.913503709 +0000 UTC m=+6.039631467,LastTimestamp:2026-03-17 11:11:42.913503709 +0000 UTC m=+6.039631467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.962463 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c804228d68a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.933563018 +0000 UTC m=+6.059690776,LastTimestamp:2026-03-17 11:11:42.933563018 +0000 UTC m=+6.059690776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.969393 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c80425b4d54 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.936870228 +0000 UTC m=+6.062997986,LastTimestamp:2026-03-17 11:11:42.936870228 +0000 UTC m=+6.062997986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.975410 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c8070d205c8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:43.716402632 +0000 UTC m=+6.842530400,LastTimestamp:2026-03-17 11:11:43.716402632 +0000 UTC m=+6.842530400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.978141 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c807fbdb702 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:43.966729986 +0000 UTC m=+7.092857754,LastTimestamp:2026-03-17 11:11:43.966729986 +0000 UTC m=+7.092857754,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.983886 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c80805ebc82 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:43.97728269 +0000 UTC m=+7.103410448,LastTimestamp:2026-03-17 11:11:43.97728269 +0000 UTC m=+7.103410448,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.988539 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c80807728d3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:43.978883283 +0000 UTC m=+7.105011041,LastTimestamp:2026-03-17 11:11:43.978883283 +0000 UTC m=+7.105011041,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:25 crc kubenswrapper[4742]: E0317 11:12:25.994644 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c808ed89480 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:44.220148864 +0000 UTC m=+7.346276662,LastTimestamp:2026-03-17 11:11:44.220148864 +0000 UTC m=+7.346276662,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.000788 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c808fc62978 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:44.235719032 +0000 UTC m=+7.361846790,LastTimestamp:2026-03-17 11:11:44.235719032 +0000 UTC m=+7.361846790,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.004557 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c808fe03cb1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:44.237427889 +0000 UTC m=+7.363555647,LastTimestamp:2026-03-17 11:11:44.237427889 +0000 UTC 
m=+7.363555647,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.010054 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c809ef5042b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:44.490447915 +0000 UTC m=+7.616575663,LastTimestamp:2026-03-17 11:11:44.490447915 +0000 UTC m=+7.616575663,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.016074 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c809faf0f63 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:44.502640483 +0000 UTC m=+7.628768241,LastTimestamp:2026-03-17 11:11:44.502640483 +0000 UTC m=+7.628768241,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.021113 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c809fc5ec1f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:44.504138783 +0000 UTC m=+7.630266551,LastTimestamp:2026-03-17 11:11:44.504138783 +0000 UTC m=+7.630266551,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.026661 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c80acd36fb5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:44.723128245 +0000 UTC m=+7.849256003,LastTimestamp:2026-03-17 11:11:44.723128245 +0000 UTC m=+7.849256003,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.033189 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c80adc4b7a8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:44.73894084 +0000 UTC m=+7.865068598,LastTimestamp:2026-03-17 11:11:44.73894084 +0000 UTC m=+7.865068598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.039307 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c80add8c302 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:44.740254466 +0000 UTC m=+7.866382224,LastTimestamp:2026-03-17 11:11:44.740254466 +0000 UTC m=+7.866382224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.045676 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c80bc70ea3a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:44.985107002 +0000 UTC m=+8.111234760,LastTimestamp:2026-03-17 11:11:44.985107002 +0000 UTC m=+8.111234760,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.050816 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d9c80bdbc0e95 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:45.006808725 +0000 UTC m=+8.132936493,LastTimestamp:2026-03-17 11:11:45.006808725 +0000 UTC m=+8.132936493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.056092 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 17 11:12:26 crc kubenswrapper[4742]: &Event{ObjectMeta:{kube-controller-manager-crc.189d9c810d6d5257 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 17 11:12:26 crc kubenswrapper[4742]: body: Mar 17 11:12:26 crc kubenswrapper[4742]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:46.343826007 +0000 UTC m=+9.469953795,LastTimestamp:2026-03-17 11:11:46.343826007 +0000 UTC m=+9.469953795,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 11:12:26 crc kubenswrapper[4742]: > Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.062733 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c810d6e7d73 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:46.343902579 +0000 UTC m=+9.470030367,LastTimestamp:2026-03-17 11:11:46.343902579 +0000 UTC m=+9.470030367,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.073061 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d9c80308b730c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c80308b730c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.638035724 +0000 UTC m=+5.764163482,LastTimestamp:2026-03-17 11:11:54.767192844 +0000 UTC m=+17.893320602,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.080002 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d9c8040b7bd4a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c8040b7bd4a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.90937377 +0000 UTC m=+6.035501528,LastTimestamp:2026-03-17 11:11:54.967146255 +0000 UTC m=+18.093274013,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.086865 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d9c804228d68a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c804228d68a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:42.933563018 +0000 UTC m=+6.059690776,LastTimestamp:2026-03-17 11:11:54.980605395 +0000 UTC m=+18.106733153,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.093798 4742 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 17 11:12:26 crc kubenswrapper[4742]: &Event{ObjectMeta:{kube-apiserver-crc.189d9c8332fa96a2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 17 11:12:26 crc kubenswrapper[4742]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 17 11:12:26 crc kubenswrapper[4742]: Mar 17 11:12:26 crc kubenswrapper[4742]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:55.56377565 +0000 UTC m=+18.689903428,LastTimestamp:2026-03-17 11:11:55.56377565 +0000 UTC m=+18.689903428,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 11:12:26 crc kubenswrapper[4742]: > Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.098796 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c8332fd0792 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:55.563935634 +0000 UTC m=+18.690063402,LastTimestamp:2026-03-17 11:11:55.563935634 +0000 UTC m=+18.690063402,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.103496 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d9c8332fa96a2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 17 11:12:26 crc kubenswrapper[4742]: &Event{ObjectMeta:{kube-apiserver-crc.189d9c8332fa96a2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 17 11:12:26 crc kubenswrapper[4742]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 17 11:12:26 crc kubenswrapper[4742]: Mar 17 11:12:26 crc kubenswrapper[4742]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:55.56377565 +0000 UTC m=+18.689903428,LastTimestamp:2026-03-17 11:11:55.568419195 +0000 UTC m=+18.694546953,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 11:12:26 crc kubenswrapper[4742]: > Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.108840 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d9c8332fd0792\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d9c8332fd0792 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:55.563935634 +0000 UTC m=+18.690063402,LastTimestamp:2026-03-17 11:11:55.568472386 +0000 UTC m=+18.694600144,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.113306 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 17 11:12:26 crc kubenswrapper[4742]: &Event{ObjectMeta:{kube-controller-manager-crc.189d9c836188694a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 17 11:12:26 crc kubenswrapper[4742]: body: Mar 17 11:12:26 crc kubenswrapper[4742]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:56.34482209 +0000 UTC m=+19.470949868,LastTimestamp:2026-03-17 11:11:56.34482209 +0000 UTC m=+19.470949868,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 17 11:12:26 crc kubenswrapper[4742]: > Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.118329 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c8361892e03 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:56.344872451 +0000 UTC m=+19.471000219,LastTimestamp:2026-03-17 11:11:56.344872451 +0000 UTC m=+19.471000219,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.126125 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d9c836188694a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 17 11:12:26 crc kubenswrapper[4742]: &Event{ObjectMeta:{kube-controller-manager-crc.189d9c836188694a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 17 11:12:26 crc kubenswrapper[4742]: body:
Mar 17 11:12:26 crc kubenswrapper[4742]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:56.34482209 +0000 UTC m=+19.470949868,LastTimestamp:2026-03-17 11:12:06.345806622 +0000 UTC m=+29.471934400,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 17 11:12:26 crc kubenswrapper[4742]: >
Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.132524 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d9c8361892e03\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c8361892e03 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:56.344872451 +0000 UTC m=+19.471000219,LastTimestamp:2026-03-17 11:12:06.345866874 +0000 UTC m=+29.471994642,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.137685 4742 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c85b5ccae3d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:12:06.348516925 +0000 UTC m=+29.474644713,LastTimestamp:2026-03-17 11:12:06.348516925 +0000 UTC m=+29.474644713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.144201 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d9c7fbd28dfce\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c7fbd28dfce openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:40.702195662 +0000 UTC m=+3.828323460,LastTimestamp:2026-03-17 11:12:06.476539803 +0000 UTC m=+29.602667591,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.150268 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d9c7fd0a562f7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c7fd0a562f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.029122807 +0000 UTC m=+4.155250605,LastTimestamp:2026-03-17 11:12:06.711440293 +0000 UTC m=+29.837568051,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.154625 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d9c7fd1bf6dc5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c7fd1bf6dc5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:41.047606725 +0000 UTC m=+4.173734523,LastTimestamp:2026-03-17 11:12:06.721277627 +0000 UTC m=+29.847405405,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.163161 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d9c836188694a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 17 11:12:26 crc kubenswrapper[4742]: &Event{ObjectMeta:{kube-controller-manager-crc.189d9c836188694a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 17 11:12:26 crc kubenswrapper[4742]: body:
Mar 17 11:12:26 crc kubenswrapper[4742]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:56.34482209 +0000 UTC m=+19.470949868,LastTimestamp:2026-03-17 11:12:16.345096177 +0000 UTC m=+39.471223935,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 17 11:12:26 crc kubenswrapper[4742]: >
Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.167726 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d9c8361892e03\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d9c8361892e03 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:56.344872451 +0000 UTC m=+19.471000219,LastTimestamp:2026-03-17 11:12:16.345169299 +0000 UTC m=+39.471297057,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:12:26 crc kubenswrapper[4742]: I0317 11:12:26.344624 4742 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 17 11:12:26 crc kubenswrapper[4742]: I0317 11:12:26.344724 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 17 11:12:26 crc kubenswrapper[4742]: E0317 11:12:26.352183 4742 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d9c836188694a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 17 11:12:26 crc kubenswrapper[4742]: &Event{ObjectMeta:{kube-controller-manager-crc.189d9c836188694a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 17 11:12:26 crc kubenswrapper[4742]: body:
Mar 17 11:12:26 crc kubenswrapper[4742]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:11:56.34482209 +0000 UTC m=+19.470949868,LastTimestamp:2026-03-17 11:12:26.344698614 +0000 UTC m=+49.470826382,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 17 11:12:26 crc kubenswrapper[4742]: >
Mar 17 11:12:26 crc kubenswrapper[4742]: I0317 11:12:26.581625 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:27 crc kubenswrapper[4742]: W0317 11:12:27.200039 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 17 11:12:27 crc kubenswrapper[4742]: E0317 11:12:27.200123 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 17 11:12:27 crc kubenswrapper[4742]: I0317 11:12:27.581701 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:28 crc kubenswrapper[4742]: W0317 11:12:28.560324 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 17 11:12:28 crc kubenswrapper[4742]: E0317 11:12:28.560446 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 17 11:12:28 crc kubenswrapper[4742]: I0317 11:12:28.580882 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:28 crc kubenswrapper[4742]: E0317 11:12:28.860604 4742 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 17 11:12:29 crc kubenswrapper[4742]: E0317 11:12:29.225653 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 17 11:12:29 crc kubenswrapper[4742]: I0317 11:12:29.582103 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:29 crc kubenswrapper[4742]: I0317 11:12:29.598356 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:29 crc kubenswrapper[4742]: I0317 11:12:29.600985 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:29 crc kubenswrapper[4742]: I0317 11:12:29.601027 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:29 crc kubenswrapper[4742]: I0317 11:12:29.601039 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:29 crc kubenswrapper[4742]: I0317 11:12:29.601074 4742 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 17 11:12:29 crc kubenswrapper[4742]: E0317 11:12:29.607898 4742 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 17 11:12:29 crc kubenswrapper[4742]: I0317 11:12:29.662784 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:29 crc kubenswrapper[4742]: I0317 11:12:29.664826 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:29 crc kubenswrapper[4742]: I0317 11:12:29.664930 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:29 crc kubenswrapper[4742]: I0317 11:12:29.664954 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:29 crc kubenswrapper[4742]: I0317 11:12:29.666161 4742 scope.go:117] "RemoveContainer" containerID="a06f4ae84833508b5a46cd3cf142fc9e1ca5de2f9de2bb120142f3a9d9428b7b"
Mar 17 11:12:30 crc kubenswrapper[4742]: I0317 11:12:30.580452 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:30 crc kubenswrapper[4742]: I0317 11:12:30.980888 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 17 11:12:30 crc kubenswrapper[4742]: I0317 11:12:30.986889 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200"}
Mar 17 11:12:30 crc kubenswrapper[4742]: I0317 11:12:30.987396 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:30 crc kubenswrapper[4742]: I0317 11:12:30.989258 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:30 crc kubenswrapper[4742]: I0317 11:12:30.989326 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:30 crc kubenswrapper[4742]: I0317 11:12:30.989348 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:31 crc kubenswrapper[4742]: I0317 11:12:31.581483 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:31 crc kubenswrapper[4742]: I0317 11:12:31.992694 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 17 11:12:31 crc kubenswrapper[4742]: I0317 11:12:31.993939 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 17 11:12:31 crc kubenswrapper[4742]: I0317 11:12:31.996037 4742 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200" exitCode=255
Mar 17 11:12:31 crc kubenswrapper[4742]: I0317 11:12:31.996103 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200"}
Mar 17 11:12:31 crc kubenswrapper[4742]: I0317 11:12:31.996174 4742 scope.go:117] "RemoveContainer" containerID="a06f4ae84833508b5a46cd3cf142fc9e1ca5de2f9de2bb120142f3a9d9428b7b"
Mar 17 11:12:31 crc kubenswrapper[4742]: I0317 11:12:31.996371 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:31 crc kubenswrapper[4742]: I0317 11:12:31.997766 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:31 crc kubenswrapper[4742]: I0317 11:12:31.997805 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:31 crc kubenswrapper[4742]: I0317 11:12:31.997823 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:31 crc kubenswrapper[4742]: I0317 11:12:31.998590 4742 scope.go:117] "RemoveContainer" containerID="a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200"
Mar 17 11:12:31 crc kubenswrapper[4742]: E0317 11:12:31.998829 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 17 11:12:32 crc kubenswrapper[4742]: I0317 11:12:32.578376 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:32 crc kubenswrapper[4742]: I0317 11:12:32.595091 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 17 11:12:32 crc kubenswrapper[4742]: I0317 11:12:32.595343 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:32 crc kubenswrapper[4742]: I0317 11:12:32.597179 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:32 crc kubenswrapper[4742]: I0317 11:12:32.597239 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:32 crc kubenswrapper[4742]: I0317 11:12:32.597262 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:33 crc kubenswrapper[4742]: I0317 11:12:33.002061 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 17 11:12:33 crc kubenswrapper[4742]: I0317 11:12:33.579594 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:34 crc kubenswrapper[4742]: I0317 11:12:34.580766 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:35 crc kubenswrapper[4742]: I0317 11:12:35.579436 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:36 crc kubenswrapper[4742]: E0317 11:12:36.235380 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.335869 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.336213 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.338329 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.338399 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.338429 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.339435 4742 scope.go:117] "RemoveContainer" containerID="a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200"
Mar 17 11:12:36 crc kubenswrapper[4742]: E0317 11:12:36.339792 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.344530 4742 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.344599 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.344658 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.344818 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.351292 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.351380 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.351409 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.352597 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.352853 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223" gracePeriod=30
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.502252 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.579708 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.608330 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.610021 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.610111 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.610126 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:36 crc kubenswrapper[4742]: I0317 11:12:36.610154 4742 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 17 11:12:36 crc kubenswrapper[4742]: E0317 11:12:36.616017 4742 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 17 11:12:37 crc kubenswrapper[4742]: I0317 11:12:37.021187 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 17 11:12:37 crc kubenswrapper[4742]: I0317 11:12:37.022710 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 17 11:12:37 crc kubenswrapper[4742]: I0317 11:12:37.023166 4742 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223" exitCode=255
Mar 17 11:12:37 crc kubenswrapper[4742]: I0317 11:12:37.023246 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223"}
Mar 17 11:12:37 crc kubenswrapper[4742]: I0317 11:12:37.023344 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ce55c9fe552db57aed7315321391c7967cf58577562e2bc07bf2299a9c984277"}
Mar 17 11:12:37 crc kubenswrapper[4742]: I0317 11:12:37.023386 4742 scope.go:117] "RemoveContainer" containerID="cae7df3f4ea292aace885f0fa3f3c6cdd8b702a84e22dccbb5cc9cd966d07764"
Mar 17 11:12:37 crc kubenswrapper[4742]: I0317 11:12:37.023474 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:37 crc kubenswrapper[4742]: I0317 11:12:37.023496 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:37 crc kubenswrapper[4742]: I0317 11:12:37.025988 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:37 crc kubenswrapper[4742]: I0317 11:12:37.026033 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:37 crc kubenswrapper[4742]: I0317 11:12:37.026092 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:37 crc kubenswrapper[4742]: I0317 11:12:37.026055 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:37 crc kubenswrapper[4742]: I0317 11:12:37.026241 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:37 crc kubenswrapper[4742]: I0317 11:12:37.026267 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:37 crc kubenswrapper[4742]: I0317 11:12:37.027306 4742 scope.go:117] "RemoveContainer" containerID="a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200"
Mar 17 11:12:37 crc kubenswrapper[4742]: E0317 11:12:37.027673 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 17 11:12:37 crc kubenswrapper[4742]: I0317 11:12:37.580495 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:38 crc kubenswrapper[4742]: I0317 11:12:38.031021 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 17 11:12:38 crc kubenswrapper[4742]: I0317 11:12:38.032651 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:38 crc kubenswrapper[4742]: I0317 11:12:38.034042 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:38 crc kubenswrapper[4742]: I0317 11:12:38.034103 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:38 crc kubenswrapper[4742]: I0317 11:12:38.034130 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:38 crc kubenswrapper[4742]: I0317 11:12:38.580142 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:38 crc kubenswrapper[4742]: E0317 11:12:38.861116 4742 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 17 11:12:39 crc kubenswrapper[4742]: I0317 11:12:39.578526 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:40 crc kubenswrapper[4742]: I0317 11:12:40.580625 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:41 crc kubenswrapper[4742]: I0317 11:12:41.580163 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:41 crc kubenswrapper[4742]: I0317 11:12:41.927220 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 17 11:12:41 crc kubenswrapper[4742]: I0317 11:12:41.928281 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:41 crc kubenswrapper[4742]: I0317 11:12:41.929732 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:41 crc kubenswrapper[4742]: I0317 11:12:41.929771 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:41 crc kubenswrapper[4742]: I0317 11:12:41.929784 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:42 crc kubenswrapper[4742]: I0317 11:12:42.577865 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:43 crc kubenswrapper[4742]: E0317 11:12:43.242259 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 17 11:12:43 crc kubenswrapper[4742]: I0317 11:12:43.343858 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 17 11:12:43 crc kubenswrapper[4742]: I0317 11:12:43.344123 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:43 crc kubenswrapper[4742]: I0317 11:12:43.345824 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:43 crc kubenswrapper[4742]: I0317 11:12:43.345875 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:43 crc kubenswrapper[4742]: I0317 11:12:43.345891 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:43 crc kubenswrapper[4742]: I0317 11:12:43.349021 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 17 11:12:43 crc kubenswrapper[4742]: I0317 11:12:43.578379 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:43 crc kubenswrapper[4742]: I0317 11:12:43.616845 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:43 crc kubenswrapper[4742]: I0317 11:12:43.618535 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:43 crc kubenswrapper[4742]: I0317 11:12:43.618587 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:43 crc kubenswrapper[4742]: I0317 11:12:43.618600 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:43 crc kubenswrapper[4742]: I0317 11:12:43.618634 4742 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 17 11:12:43 crc kubenswrapper[4742]: E0317 11:12:43.622625 4742 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 17 11:12:44 crc kubenswrapper[4742]: I0317 11:12:44.054850 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:44 crc kubenswrapper[4742]: I0317 11:12:44.055951 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:44 crc kubenswrapper[4742]: I0317 11:12:44.056003 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:44 crc kubenswrapper[4742]: I0317 11:12:44.056017 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:44 crc kubenswrapper[4742]: I0317 11:12:44.582868 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:45 crc kubenswrapper[4742]: I0317 11:12:45.579024 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:46 crc kubenswrapper[4742]: I0317 11:12:46.579322 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:47 crc kubenswrapper[4742]: I0317 11:12:47.577132 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:48 crc kubenswrapper[4742]: I0317 11:12:48.578499 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:48 crc kubenswrapper[4742]: I0317 11:12:48.662054 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:48 crc kubenswrapper[4742]: I0317 11:12:48.663587 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:48 crc kubenswrapper[4742]: I0317 11:12:48.663661 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:48 crc kubenswrapper[4742]: I0317 11:12:48.663694 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:48 crc kubenswrapper[4742]: I0317 11:12:48.664493 4742 scope.go:117] "RemoveContainer" containerID="a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200"
Mar 17 11:12:48 crc kubenswrapper[4742]: E0317 11:12:48.664715 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 17 11:12:48 crc kubenswrapper[4742]: E0317 11:12:48.862305 4742 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 17 11:12:49 crc kubenswrapper[4742]: I0317 11:12:49.577440 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:50 crc kubenswrapper[4742]: E0317 11:12:50.248077 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 17 11:12:50 crc kubenswrapper[4742]: I0317 11:12:50.578893 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:50 crc kubenswrapper[4742]: I0317 11:12:50.623520 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:50 crc kubenswrapper[4742]: I0317 11:12:50.624970 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:50 crc kubenswrapper[4742]: I0317 11:12:50.625004 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:50 crc kubenswrapper[4742]: I0317 11:12:50.625016 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:50 crc kubenswrapper[4742]: I0317 11:12:50.625038 4742 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 17 11:12:50 crc kubenswrapper[4742]: E0317 11:12:50.629821 4742 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 17 11:12:51 crc kubenswrapper[4742]: I0317 11:12:51.580043 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:51 crc kubenswrapper[4742]: I0317 11:12:51.933201 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 17 11:12:51 crc kubenswrapper[4742]: I0317 11:12:51.933470 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:51 crc kubenswrapper[4742]: I0317 11:12:51.935290 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:51 crc kubenswrapper[4742]: I0317 11:12:51.935341 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:51 crc kubenswrapper[4742]: I0317 11:12:51.935358 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:51 crc kubenswrapper[4742]: I0317 11:12:51.959013 4742 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 17 11:12:51 crc kubenswrapper[4742]: I0317 11:12:51.980156 4742 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 17 11:12:52 crc kubenswrapper[4742]: W0317 11:12:52.298584 4742 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 17 11:12:52 crc kubenswrapper[4742]: E0317 11:12:52.298650 4742 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 17 11:12:52 crc kubenswrapper[4742]: I0317 11:12:52.579816 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:53 crc kubenswrapper[4742]: I0317 11:12:53.578668 4742 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 17 11:12:53 crc kubenswrapper[4742]: I0317 11:12:53.591205 4742 csr.go:261] certificate signing request csr-f9ltv is approved, waiting to be issued
Mar 17 11:12:53 crc kubenswrapper[4742]: I0317 11:12:53.606598 4742 csr.go:257] certificate signing request csr-f9ltv is issued
Mar 17 11:12:53 crc kubenswrapper[4742]: I0317 11:12:53.717239 4742 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 17 11:12:54 crc kubenswrapper[4742]: I0317 11:12:54.021531 4742 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 17 11:12:54 crc kubenswrapper[4742]: I0317 11:12:54.608377 4742 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-13 10:37:00.930264916 +0000 UTC
Mar 17 11:12:54 crc kubenswrapper[4742]: I0317 11:12:54.608439 4742 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7247h24m6.321828995s for next certificate rotation
Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.630450 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.632239 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.632317 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.632342 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.632763 4742 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.643062 4742 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.643406 4742 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 17 11:12:57 crc kubenswrapper[4742]: E0317 11:12:57.643432 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.647797 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.647825 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.647842 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.647857 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.647869 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:12:57Z","lastTransitionTime":"2026-03-17T11:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 17 11:12:57 crc kubenswrapper[4742]: E0317 11:12:57.662275 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.672662 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.672748 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.672779 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.672811 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.672836 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:12:57Z","lastTransitionTime":"2026-03-17T11:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 17 11:12:57 crc kubenswrapper[4742]: E0317 11:12:57.687830 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.696714 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.696782 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.696803 4742 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.696836 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.696857 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:12:57Z","lastTransitionTime":"2026-03-17T11:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:12:57 crc kubenswrapper[4742]: E0317 11:12:57.709866 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.721853 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.721972 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.722000 4742 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.722028 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:12:57 crc kubenswrapper[4742]: I0317 11:12:57.722052 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:12:57Z","lastTransitionTime":"2026-03-17T11:12:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:12:57 crc kubenswrapper[4742]: E0317 11:12:57.735199 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:12:57 crc kubenswrapper[4742]: E0317 11:12:57.735383 4742 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 11:12:57 crc kubenswrapper[4742]: E0317 11:12:57.735429 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:57 crc kubenswrapper[4742]: E0317 11:12:57.836156 4742 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:57 crc kubenswrapper[4742]: E0317 11:12:57.936843 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:58 crc kubenswrapper[4742]: E0317 11:12:58.037991 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:58 crc kubenswrapper[4742]: E0317 11:12:58.138542 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:58 crc kubenswrapper[4742]: E0317 11:12:58.238769 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:58 crc kubenswrapper[4742]: E0317 11:12:58.339030 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:58 crc kubenswrapper[4742]: E0317 11:12:58.440176 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:58 crc kubenswrapper[4742]: E0317 11:12:58.540438 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:58 crc kubenswrapper[4742]: E0317 11:12:58.641035 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:58 crc kubenswrapper[4742]: E0317 11:12:58.742190 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:58 crc kubenswrapper[4742]: E0317 11:12:58.843104 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:58 crc kubenswrapper[4742]: E0317 11:12:58.862483 4742 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 17 11:12:58 crc kubenswrapper[4742]: E0317 11:12:58.944034 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:59 crc kubenswrapper[4742]: E0317 11:12:59.044756 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:59 crc kubenswrapper[4742]: E0317 11:12:59.145409 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:59 crc kubenswrapper[4742]: E0317 11:12:59.245942 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:59 crc kubenswrapper[4742]: E0317 11:12:59.346991 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:59 crc kubenswrapper[4742]: E0317 11:12:59.448119 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:59 crc kubenswrapper[4742]: E0317 11:12:59.548502 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:59 crc kubenswrapper[4742]: E0317 11:12:59.649246 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:59 crc kubenswrapper[4742]: E0317 11:12:59.749657 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:59 crc 
kubenswrapper[4742]: E0317 11:12:59.850640 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:12:59 crc kubenswrapper[4742]: E0317 11:12:59.951764 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:00 crc kubenswrapper[4742]: E0317 11:13:00.051972 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:00 crc kubenswrapper[4742]: E0317 11:13:00.152765 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:00 crc kubenswrapper[4742]: E0317 11:13:00.253858 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:00 crc kubenswrapper[4742]: E0317 11:13:00.354885 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:00 crc kubenswrapper[4742]: E0317 11:13:00.455135 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:00 crc kubenswrapper[4742]: E0317 11:13:00.555957 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:00 crc kubenswrapper[4742]: E0317 11:13:00.656457 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:00 crc kubenswrapper[4742]: E0317 11:13:00.757116 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:00 crc kubenswrapper[4742]: E0317 11:13:00.857341 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:00 crc kubenswrapper[4742]: E0317 11:13:00.957901 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:01 crc kubenswrapper[4742]: E0317 11:13:01.058605 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:01 crc kubenswrapper[4742]: E0317 11:13:01.159396 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:01 crc kubenswrapper[4742]: E0317 11:13:01.260585 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:01 crc kubenswrapper[4742]: E0317 11:13:01.361780 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:01 crc kubenswrapper[4742]: E0317 11:13:01.462851 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:01 crc kubenswrapper[4742]: E0317 11:13:01.563049 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:01 crc kubenswrapper[4742]: E0317 11:13:01.663734 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:01 crc kubenswrapper[4742]: E0317 11:13:01.764782 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:01 crc kubenswrapper[4742]: E0317 11:13:01.866319 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
17 11:13:01 crc kubenswrapper[4742]: E0317 11:13:01.967009 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:02 crc kubenswrapper[4742]: E0317 11:13:02.068362 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:02 crc kubenswrapper[4742]: E0317 11:13:02.169111 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:02 crc kubenswrapper[4742]: E0317 11:13:02.270447 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:02 crc kubenswrapper[4742]: E0317 11:13:02.371572 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:02 crc kubenswrapper[4742]: E0317 11:13:02.472506 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:02 crc kubenswrapper[4742]: E0317 11:13:02.573169 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:02 crc kubenswrapper[4742]: I0317 11:13:02.662259 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 17 11:13:02 crc kubenswrapper[4742]: I0317 11:13:02.663687 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:02 crc kubenswrapper[4742]: I0317 11:13:02.663718 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:02 crc kubenswrapper[4742]: I0317 11:13:02.663727 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:02 crc kubenswrapper[4742]: I0317 11:13:02.664546 4742 scope.go:117] "RemoveContainer" containerID="a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200" Mar 17 11:13:02 crc kubenswrapper[4742]: E0317 11:13:02.664735 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 17 11:13:02 crc kubenswrapper[4742]: E0317 11:13:02.674158 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:02 crc kubenswrapper[4742]: E0317 11:13:02.775136 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:02 crc kubenswrapper[4742]: E0317 11:13:02.875443 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:02 crc kubenswrapper[4742]: E0317 11:13:02.976329 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:03 crc kubenswrapper[4742]: E0317 11:13:03.076813 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:03 crc kubenswrapper[4742]: E0317 11:13:03.177649 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:03 crc 
kubenswrapper[4742]: E0317 11:13:03.278836 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:03 crc kubenswrapper[4742]: E0317 11:13:03.379284 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:03 crc kubenswrapper[4742]: E0317 11:13:03.480415 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:03 crc kubenswrapper[4742]: E0317 11:13:03.581122 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:03 crc kubenswrapper[4742]: E0317 11:13:03.682217 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:03 crc kubenswrapper[4742]: E0317 11:13:03.783266 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:03 crc kubenswrapper[4742]: E0317 11:13:03.884357 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:03 crc kubenswrapper[4742]: E0317 11:13:03.984511 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:04 crc kubenswrapper[4742]: E0317 11:13:04.084855 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:04 crc kubenswrapper[4742]: E0317 11:13:04.185543 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:04 crc kubenswrapper[4742]: E0317 11:13:04.286335 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:04 crc kubenswrapper[4742]: E0317 11:13:04.387396 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:04 crc kubenswrapper[4742]: E0317 11:13:04.488143 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:04 crc kubenswrapper[4742]: E0317 11:13:04.588954 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:04 crc kubenswrapper[4742]: E0317 11:13:04.689165 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:04 crc kubenswrapper[4742]: E0317 11:13:04.790321 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:04 crc kubenswrapper[4742]: E0317 11:13:04.891230 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:04 crc kubenswrapper[4742]: E0317 11:13:04.992198 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:05 crc kubenswrapper[4742]: E0317 11:13:05.093044 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:05 crc kubenswrapper[4742]: E0317 11:13:05.193800 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:05 crc kubenswrapper[4742]: E0317 11:13:05.294207 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
17 11:13:05 crc kubenswrapper[4742]: E0317 11:13:05.394591 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:05 crc kubenswrapper[4742]: E0317 11:13:05.495081 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:05 crc kubenswrapper[4742]: E0317 11:13:05.596039 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:05 crc kubenswrapper[4742]: E0317 11:13:05.696698 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:05 crc kubenswrapper[4742]: E0317 11:13:05.797633 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:05 crc kubenswrapper[4742]: E0317 11:13:05.898608 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:05 crc kubenswrapper[4742]: E0317 11:13:05.999773 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:06 crc kubenswrapper[4742]: E0317 11:13:06.101038 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:06 crc kubenswrapper[4742]: E0317 11:13:06.201762 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:06 crc kubenswrapper[4742]: E0317 11:13:06.302548 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:06 crc kubenswrapper[4742]: E0317 11:13:06.402766 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:06 crc kubenswrapper[4742]: E0317 11:13:06.502996 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:06 crc kubenswrapper[4742]: E0317 11:13:06.603845 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:06 crc kubenswrapper[4742]: E0317 11:13:06.705072 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:06 crc kubenswrapper[4742]: E0317 11:13:06.806143 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:06 crc kubenswrapper[4742]: E0317 11:13:06.907073 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:07 crc kubenswrapper[4742]: E0317 11:13:07.007477 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:07 crc kubenswrapper[4742]: E0317 11:13:07.108491 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:07 crc kubenswrapper[4742]: E0317 11:13:07.209591 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:07 crc kubenswrapper[4742]: E0317 11:13:07.309693 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:07 crc kubenswrapper[4742]: E0317 11:13:07.411135 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:07 crc kubenswrapper[4742]: E0317 11:13:07.511237 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:07 crc kubenswrapper[4742]: E0317 11:13:07.611779 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:07 crc kubenswrapper[4742]: E0317 11:13:07.712183 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:07 crc kubenswrapper[4742]: E0317 11:13:07.813229 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:07 crc kubenswrapper[4742]: E0317 11:13:07.913739 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:08 crc kubenswrapper[4742]: E0317 11:13:08.014708 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:08 crc kubenswrapper[4742]: E0317 11:13:08.024191 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.030376 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.030465 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.030504 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.030537 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.030561 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:08Z","lastTransitionTime":"2026-03-17T11:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:08 crc kubenswrapper[4742]: E0317 11:13:08.042589 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.053852 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.053988 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.054028 4742 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.054063 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.054088 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:08Z","lastTransitionTime":"2026-03-17T11:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:08 crc kubenswrapper[4742]: E0317 11:13:08.072136 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.082470 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.082557 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.082586 4742 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.082619 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.082649 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:08Z","lastTransitionTime":"2026-03-17T11:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:08 crc kubenswrapper[4742]: E0317 11:13:08.099461 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.108622 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.108714 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.108736 4742 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.108766 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:08 crc kubenswrapper[4742]: I0317 11:13:08.108788 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:08Z","lastTransitionTime":"2026-03-17T11:13:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:08 crc kubenswrapper[4742]: E0317 11:13:08.122073 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:08 crc kubenswrapper[4742]: E0317 11:13:08.122240 4742 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 11:13:08 crc kubenswrapper[4742]: E0317 11:13:08.122276 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:08 crc kubenswrapper[4742]: E0317 11:13:08.223100 4742 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:08 crc kubenswrapper[4742]: E0317 11:13:08.323733 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:08 crc kubenswrapper[4742]: E0317 11:13:08.424582 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:08 crc kubenswrapper[4742]: E0317 11:13:08.525402 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:08 crc kubenswrapper[4742]: E0317 11:13:08.626075 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:08 crc kubenswrapper[4742]: E0317 11:13:08.726220 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:08 crc kubenswrapper[4742]: E0317 11:13:08.827545 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:08 crc kubenswrapper[4742]: E0317 11:13:08.863770 4742 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 17 11:13:08 crc kubenswrapper[4742]: E0317 11:13:08.928144 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:09 crc kubenswrapper[4742]: E0317 11:13:09.028858 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:09 crc kubenswrapper[4742]: E0317 11:13:09.129696 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:09 crc kubenswrapper[4742]: E0317 11:13:09.230695 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:09 crc kubenswrapper[4742]: E0317 11:13:09.331225 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:09 crc kubenswrapper[4742]: E0317 11:13:09.432356 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:09 crc kubenswrapper[4742]: E0317 11:13:09.532718 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:09 crc kubenswrapper[4742]: E0317 11:13:09.633756 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:09 crc kubenswrapper[4742]: I0317 11:13:09.663164 4742 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 17 11:13:09 crc kubenswrapper[4742]: I0317 11:13:09.665577 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:13:09 crc kubenswrapper[4742]: I0317 11:13:09.665685 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:13:09 crc kubenswrapper[4742]: I0317 11:13:09.665698 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:13:09 crc kubenswrapper[4742]: E0317 11:13:09.734980 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:09 crc kubenswrapper[4742]: E0317 11:13:09.836016 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:09 crc kubenswrapper[4742]: E0317 11:13:09.937165 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:10 crc kubenswrapper[4742]: E0317 11:13:10.037488 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:10 crc kubenswrapper[4742]: E0317 11:13:10.137740 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:10 crc kubenswrapper[4742]: E0317 11:13:10.238823 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:10 crc kubenswrapper[4742]: E0317 11:13:10.340203 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:10 crc kubenswrapper[4742]: E0317 11:13:10.441392 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:10 crc kubenswrapper[4742]: E0317 11:13:10.542545 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:10 crc kubenswrapper[4742]: E0317 11:13:10.643104 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:10 crc kubenswrapper[4742]: E0317 11:13:10.743596 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:10 crc kubenswrapper[4742]: E0317 11:13:10.845000 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:10 crc kubenswrapper[4742]: E0317 11:13:10.945431 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:11 crc kubenswrapper[4742]: E0317 11:13:11.047059 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:11 crc kubenswrapper[4742]: E0317 11:13:11.147182 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:11 crc kubenswrapper[4742]: E0317 11:13:11.247601 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:11 crc kubenswrapper[4742]: E0317 11:13:11.348084 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:11 crc kubenswrapper[4742]: E0317 11:13:11.448584 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:11 crc kubenswrapper[4742]: E0317 11:13:11.549488 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:11 crc kubenswrapper[4742]: E0317 11:13:11.650284 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:11 crc kubenswrapper[4742]: E0317 11:13:11.751141 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 17 11:13:11 crc kubenswrapper[4742]: E0317 11:13:11.851372 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 
17 11:13:11 crc kubenswrapper[4742]: E0317 11:13:11.951859 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:12 crc kubenswrapper[4742]: E0317 11:13:12.052943 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:12 crc kubenswrapper[4742]: E0317 11:13:12.153945 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:12 crc kubenswrapper[4742]: E0317 11:13:12.254149 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:12 crc kubenswrapper[4742]: E0317 11:13:12.354821 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:12 crc kubenswrapper[4742]: I0317 11:13:12.412336 4742 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 17 11:13:12 crc kubenswrapper[4742]: E0317 11:13:12.455501 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:12 crc kubenswrapper[4742]: E0317 11:13:12.555946 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:12 crc kubenswrapper[4742]: E0317 11:13:12.656354 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:12 crc kubenswrapper[4742]: E0317 11:13:12.756977 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:12 crc kubenswrapper[4742]: E0317 11:13:12.858056 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:12 crc kubenswrapper[4742]: E0317 11:13:12.959104 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:13 crc kubenswrapper[4742]: E0317 11:13:13.059587 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:13 crc kubenswrapper[4742]: E0317 11:13:13.160537 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:13 crc kubenswrapper[4742]: E0317 11:13:13.261629 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:13 crc kubenswrapper[4742]: E0317 11:13:13.362797 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:13 crc kubenswrapper[4742]: E0317 11:13:13.463739 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:13 crc kubenswrapper[4742]: E0317 11:13:13.564601 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:13 crc kubenswrapper[4742]: E0317 11:13:13.665713 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:13 crc kubenswrapper[4742]: E0317 11:13:13.766456 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:13 crc kubenswrapper[4742]: E0317 11:13:13.867710 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 17 11:13:13 crc kubenswrapper[4742]: E0317 11:13:13.968539 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:14 crc kubenswrapper[4742]: E0317 11:13:14.069130 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:14 crc kubenswrapper[4742]: E0317 11:13:14.169625 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:14 crc kubenswrapper[4742]: E0317 11:13:14.270185 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:14 crc kubenswrapper[4742]: E0317 11:13:14.370363 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:14 crc kubenswrapper[4742]: E0317 11:13:14.471677 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:14 crc kubenswrapper[4742]: E0317 11:13:14.572236 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:14 crc kubenswrapper[4742]: E0317 11:13:14.673731 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:14 crc kubenswrapper[4742]: E0317 11:13:14.775158 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:14 crc kubenswrapper[4742]: E0317 11:13:14.876176 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:14 crc kubenswrapper[4742]: E0317 11:13:14.976850 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:15 crc kubenswrapper[4742]: E0317 11:13:15.077861 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:15 crc kubenswrapper[4742]: E0317 11:13:15.178661 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:15 crc kubenswrapper[4742]: E0317 11:13:15.279755 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:15 crc kubenswrapper[4742]: E0317 11:13:15.380669 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:15 crc kubenswrapper[4742]: E0317 11:13:15.481426 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:15 crc kubenswrapper[4742]: E0317 11:13:15.581789 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:15 crc kubenswrapper[4742]: E0317 11:13:15.682367 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:15 crc kubenswrapper[4742]: E0317 11:13:15.782926 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:15 crc kubenswrapper[4742]: E0317 11:13:15.883082 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:15 crc kubenswrapper[4742]: E0317 11:13:15.983583 4742 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 17 11:13:16 crc kubenswrapper[4742]: E0317 11:13:16.084443 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:16 crc kubenswrapper[4742]: E0317 11:13:16.184669 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:16 crc kubenswrapper[4742]: E0317 11:13:16.285713 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:16 crc kubenswrapper[4742]: E0317 11:13:16.386228 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:16 crc kubenswrapper[4742]: E0317 11:13:16.487234 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:16 crc kubenswrapper[4742]: E0317 11:13:16.588070 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:16 crc kubenswrapper[4742]: E0317 11:13:16.688278 4742 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 17 11:13:16 crc kubenswrapper[4742]: I0317 11:13:16.738684 4742 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 17 11:13:16 crc kubenswrapper[4742]: I0317 11:13:16.791574 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:16 crc kubenswrapper[4742]: I0317 11:13:16.792381 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:16 crc kubenswrapper[4742]: I0317 11:13:16.792483 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:16 crc kubenswrapper[4742]: I0317 11:13:16.792600 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:16 crc kubenswrapper[4742]: I0317 11:13:16.792700 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:16Z","lastTransitionTime":"2026-03-17T11:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:16 crc kubenswrapper[4742]: I0317 11:13:16.896703 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:16 crc kubenswrapper[4742]: I0317 11:13:16.897139 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:16 crc kubenswrapper[4742]: I0317 11:13:16.897242 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:16 crc kubenswrapper[4742]: I0317 11:13:16.897346 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:16 crc kubenswrapper[4742]: I0317 11:13:16.897452 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:16Z","lastTransitionTime":"2026-03-17T11:13:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.001199 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.001274 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.001293 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.001322 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.001345 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:17Z","lastTransitionTime":"2026-03-17T11:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.105366 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.105426 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.105443 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.105472 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.105527 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:17Z","lastTransitionTime":"2026-03-17T11:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.208966 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.209043 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.209064 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.209097 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.209121 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:17Z","lastTransitionTime":"2026-03-17T11:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.312781 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.312842 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.312860 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.312885 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.312902 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:17Z","lastTransitionTime":"2026-03-17T11:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.415718 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.415791 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.415811 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.415843 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.415861 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:17Z","lastTransitionTime":"2026-03-17T11:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.519485 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.519578 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.519610 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.519644 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.519666 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:17Z","lastTransitionTime":"2026-03-17T11:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.599875 4742 apiserver.go:52] "Watching apiserver" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.607592 4742 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.608249 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/iptables-alerter-4ln5h","openshift-dns/node-resolver-kwrj5","openshift-machine-config-operator/machine-config-daemon-5jxxw","openshift-multus/multus-additional-cni-plugins-hcxv8","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-node-zwfsr","openshift-multus/multus-xwmfc","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.608890 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.609167 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.609270 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.609350 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.609412 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.609662 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.609870 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.609977 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.610179 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.611636 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kwrj5" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.612186 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.612393 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.612309 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.612622 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.614655 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.615597 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.617830 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.619116 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.620330 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.621023 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.621361 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.621474 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.621600 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.621619 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.621741 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.621755 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.621981 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.622464 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.623178 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.623375 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.624059 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.624168 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.624542 4742 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.624884 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.624982 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.625172 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.625315 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.625516 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.625628 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.625717 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.625534 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.625945 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.625340 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.626291 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.627229 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.631627 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.631691 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.631714 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.631755 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.631779 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:17Z","lastTransitionTime":"2026-03-17T11:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.657527 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.690021 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.699932 4742 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.711632 4742 scope.go:117] "RemoveContainer" containerID="a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.711699 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.711764 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.712183 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.724234 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.736528 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.736575 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.736587 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.736607 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.736619 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:17Z","lastTransitionTime":"2026-03-17T11:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.737955 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.752375 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.768178 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780471 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780517 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780544 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780568 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780586 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780601 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780620 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780639 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780660 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780680 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780704 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780725 4742 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780741 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780764 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780786 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780810 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780830 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780849 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780873 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780900 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780938 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.780980 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781008 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781032 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781062 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781089 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781113 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781141 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781166 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781190 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781214 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781245 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781271 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781416 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781399 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781451 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781481 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781505 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781535 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781561 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781591 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 17 11:13:17 crc 
kubenswrapper[4742]: I0317 11:13:17.781619 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781644 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781670 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781697 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781721 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781725 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781818 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781855 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781881 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781920 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781948 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.781974 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782033 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782057 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782080 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782103 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782133 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782157 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782185 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782207 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782250 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782272 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782271 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782298 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782325 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782345 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782363 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782380 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782400 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782420 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782439 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782459 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782478 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782500 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782519 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782537 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782558 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782578 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782600 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782622 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782643 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782660 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782679 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782695 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782711 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782730 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782749 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782766 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782784 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782803 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782820 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782838 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782861 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782888 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782926 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782950 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782968 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782985 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783001 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.782994 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783019 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783037 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783056 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783075 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783096 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783114 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783134 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783156 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783176 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783195 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783214 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783234 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783256 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783273 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783446 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783647 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783664 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783683 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783700 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783720 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783739 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783761 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783849 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783882 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783929 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783955 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783976 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783998 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.784019 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.784039 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" 
(UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.784061 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785310 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785356 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785452 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785476 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785500 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785526 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785553 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785580 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785609 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785636 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785660 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785686 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785719 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785745 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785771 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785798 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785821 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785843 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785868 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785893 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785936 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785961 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785985 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786014 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786043 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786070 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786096 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786120 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786147 4742 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786172 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786197 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786251 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786276 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786301 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786330 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786358 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786383 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786407 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786436 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786463 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786527 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.788187 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783146 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783311 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.783957 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.784175 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.784187 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.784349 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.784557 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.784589 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.784645 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.784794 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.784770 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.784837 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.784924 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785206 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785217 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785250 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785276 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.794118 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.794265 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785567 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785592 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785617 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785692 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785718 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.785920 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786021 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786089 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786272 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786300 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786442 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.786531 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:13:18.286502089 +0000 UTC m=+101.412629857 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.794516 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.794554 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.794579 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.794601 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.794621 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.794643 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.794641 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.794663 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.794971 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.794973 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786888 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.787041 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.786932 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.787363 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.787342 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.787430 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.787454 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.787694 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.787863 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.795156 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.787856 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.788512 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.788370 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.788628 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.788657 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.788860 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.788840 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.789578 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.789597 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.789633 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.789849 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.790402 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.790338 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.790853 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.791040 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.791064 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.791527 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). 
InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.791816 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.791937 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.792278 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.792314 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.792332 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.792343 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.792350 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.792699 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.792960 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.792443 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.793205 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.793235 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.793399 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.793459 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.793730 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.793767 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.793816 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.794065 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.795006 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.795344 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.795454 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.795461 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.796255 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). 
InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.796451 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.796561 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.796948 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.797561 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.797819 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.797950 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798180 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798239 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.795032 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798276 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798285 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798385 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798414 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798452 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798480 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798511 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") 
" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798535 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798563 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798460 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798586 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798611 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798636 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798686 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798735 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.798867 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799032 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799063 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799076 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799100 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799323 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799349 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799377 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799497 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-var-lib-cni-multus\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799526 4742 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-multus-daemon-config\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799662 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799707 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a0932050-dced-4c05-b9d2-d8db1db0dceb-cni-binary-copy\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799710 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799737 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-var-lib-openvswitch\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799843 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799846 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-cni-bin\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799898 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e11ad39-38bb-4b70-9cac-ce078b37f882-proxy-tls\") pod \"machine-config-daemon-5jxxw\" (UID: \"5e11ad39-38bb-4b70-9cac-ce078b37f882\") " pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799944 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-log-socket\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.799973 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-var-lib-cni-bin\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.800003 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2qj9\" (UniqueName: \"kubernetes.io/projected/a0932050-dced-4c05-b9d2-d8db1db0dceb-kube-api-access-r2qj9\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.800036 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.800059 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a0932050-dced-4c05-b9d2-d8db1db0dceb-cnibin\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.800074 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.800081 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-node-log\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.800130 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w98f\" (UniqueName: \"kubernetes.io/projected/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-kube-api-access-4w98f\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.800198 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.800200 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.800224 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmpzl\" (UniqueName: \"kubernetes.io/projected/5e11ad39-38bb-4b70-9cac-ce078b37f882-kube-api-access-vmpzl\") pod \"machine-config-daemon-5jxxw\" (UID: \"5e11ad39-38bb-4b70-9cac-ce078b37f882\") " pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.800247 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.800267 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-cni-binary-copy\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.800987 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.800998 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.801018 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.801130 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.801138 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.802471 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.801076 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.803097 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.803263 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.803369 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.804002 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.804187 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.804339 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-run-multus-certs\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.804442 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.804484 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-slash\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.804514 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-etc-openvswitch\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.804704 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.804781 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovn-node-metrics-cert\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.804942 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.805191 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.805297 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-run-k8s-cni-cncf-io\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.805167 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.805401 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.805577 4742 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.805644 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:18.305624428 +0000 UTC m=+101.431752186 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.805980 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.806121 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-os-release\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.806201 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-env-overrides\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.806320 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.806450 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5e11ad39-38bb-4b70-9cac-ce078b37f882-mcd-auth-proxy-config\") pod \"machine-config-daemon-5jxxw\" (UID: \"5e11ad39-38bb-4b70-9cac-ce078b37f882\") " pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.806550 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.806642 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-cnibin\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.806711 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-multus-socket-dir-parent\") pod \"multus-xwmfc\" (UID: 
\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.806783 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-var-lib-kubelet\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.806855 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.806949 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-systemd\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.807047 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.807164 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5e11ad39-38bb-4b70-9cac-ce078b37f882-rootfs\") pod \"machine-config-daemon-5jxxw\" (UID: \"5e11ad39-38bb-4b70-9cac-ce078b37f882\") " pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.807790 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.807077 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.807503 4742 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.807978 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-run-netns\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.808234 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-cni-netd\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.808315 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.808380 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-run-netns\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.808449 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-hostroot\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.808513 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a0932050-dced-4c05-b9d2-d8db1db0dceb-os-release\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.808557 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.808590 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a0932050-dced-4c05-b9d2-d8db1db0dceb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.808773 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovnkube-config\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.808860 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovnkube-script-lib\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.809050 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-multus-conf-dir\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.809147 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkjp8\" (UniqueName: \"kubernetes.io/projected/d021cdee-f700-4a5f-a62e-be4acbb8c62e-kube-api-access-qkjp8\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.809295 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fa31fa5e-119d-4392-b5c6-8f4a488e64af-hosts-file\") pod \"node-resolver-kwrj5\" (UID: \"fa31fa5e-119d-4392-b5c6-8f4a488e64af\") " pod="openshift-dns/node-resolver-kwrj5" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.809397 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.809555 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w4nh\" (UniqueName: \"kubernetes.io/projected/fa31fa5e-119d-4392-b5c6-8f4a488e64af-kube-api-access-5w4nh\") pod \"node-resolver-kwrj5\" (UID: \"fa31fa5e-119d-4392-b5c6-8f4a488e64af\") " pod="openshift-dns/node-resolver-kwrj5" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.809651 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-system-cni-dir\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.809698 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.809851 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.809885 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.809792 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-multus-cni-dir\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810059 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810079 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810109 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810071 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-run-ovn-kubernetes\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810193 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810285 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a0932050-dced-4c05-b9d2-d8db1db0dceb-system-cni-dir\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.810299 4742 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810327 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a0932050-dced-4c05-b9d2-d8db1db0dceb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.810378 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:18.310356679 +0000 UTC m=+101.436484437 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810405 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-kubelet\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810446 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-systemd-units\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810494 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810531 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-etc-kubernetes\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810573 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-openvswitch\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810611 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-ovn\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810785 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810826 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810849 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 
11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810872 4742 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810897 4742 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810949 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810973 4742 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810985 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.810994 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811038 4742 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811057 4742 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811073 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811088 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811105 4742 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811119 4742 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811135 4742 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811149 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811166 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811181 4742 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811195 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811209 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811224 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811238 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811253 4742 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811266 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811282 4742 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811298 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811316 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811331 4742 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811345 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811358 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811362 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811372 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811422 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811446 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811472 4742 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811493 4742 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811515 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811535 4742 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811555 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811580 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811602 4742 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811625 4742 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811648 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811669 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811691 4742 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811716 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811736 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811757 4742 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811776 4742 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811797 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811817 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811836 4742 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811855 4742 
reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811879 4742 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811899 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811949 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811968 4742 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.811987 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812007 4742 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812014 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812026 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812266 4742 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812288 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812304 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812319 4742 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812333 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812347 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812363 4742 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812378 4742 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812392 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812407 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812422 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812437 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc 
kubenswrapper[4742]: I0317 11:13:17.812450 4742 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812446 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812465 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812479 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812493 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812507 4742 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812521 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812534 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812547 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812560 4742 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812573 4742 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812586 4742 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812599 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812613 4742 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812626 4742 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812641 4742 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812655 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812669 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812681 4742 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812694 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812707 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812720 4742 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812734 4742 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812747 4742 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812758 4742 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812770 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812784 4742 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812797 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812810 4742 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812823 4742 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812836 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812849 4742 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812861 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812876 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812889 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812901 4742 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812938 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812952 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812966 4742 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812978 4742 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.812991 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.813004 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.813017 4742 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.813032 4742 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.813045 4742 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.813059 4742 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.813073 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.813086 4742 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.813035 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.813101 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.813155 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.813180 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.813203 4742 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.813225 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.813249 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.813271 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.813122 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.814482 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.814597 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.814616 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.814630 4742 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.814667 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.814683 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:18.314671329 +0000 UTC m=+101.440799287 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.819096 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.819853 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.820143 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.820184 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.820207 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.820224 4742 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.820285 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:18.320266694 +0000 UTC m=+101.446394562 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.820279 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.822663 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.823468 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.824254 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.824850 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.821575 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.829451 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.834653 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.836227 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.837509 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.837706 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.837853 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.837899 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.837964 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.838085 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.838017 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.838031 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.838419 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.838608 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.838660 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.838613 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.839268 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.839395 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.839488 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.839764 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.839822 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). 
InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.839865 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.840045 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.840094 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.840258 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.839596 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.840439 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.840703 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.840898 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.842072 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.842111 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.843405 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.843438 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.843454 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.843476 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.843490 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:17Z","lastTransitionTime":"2026-03-17T11:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.843694 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.843963 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.844060 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.844419 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.844633 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.845052 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.845172 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.845600 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.845763 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.847113 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.847309 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.847575 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.847649 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.847657 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.847622 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.848024 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.848111 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.848219 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.848265 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.848500 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.848882 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.848923 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.849070 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.850030 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.858797 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.861329 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.862775 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.862972 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.872414 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.881099 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.889834 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914268 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-run-netns\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914301 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-hostroot\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914319 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a0932050-dced-4c05-b9d2-d8db1db0dceb-os-release\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914334 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a0932050-dced-4c05-b9d2-d8db1db0dceb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914350 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovnkube-config\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914367 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovnkube-script-lib\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914381 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-multus-conf-dir\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914394 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkjp8\" (UniqueName: \"kubernetes.io/projected/d021cdee-f700-4a5f-a62e-be4acbb8c62e-kube-api-access-qkjp8\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914410 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fa31fa5e-119d-4392-b5c6-8f4a488e64af-hosts-file\") pod \"node-resolver-kwrj5\" (UID: \"fa31fa5e-119d-4392-b5c6-8f4a488e64af\") " pod="openshift-dns/node-resolver-kwrj5" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 
11:13:17.914424 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w4nh\" (UniqueName: \"kubernetes.io/projected/fa31fa5e-119d-4392-b5c6-8f4a488e64af-kube-api-access-5w4nh\") pod \"node-resolver-kwrj5\" (UID: \"fa31fa5e-119d-4392-b5c6-8f4a488e64af\") " pod="openshift-dns/node-resolver-kwrj5" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914442 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-system-cni-dir\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914457 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-multus-cni-dir\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914475 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-run-ovn-kubernetes\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914497 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a0932050-dced-4c05-b9d2-d8db1db0dceb-system-cni-dir\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914511 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a0932050-dced-4c05-b9d2-d8db1db0dceb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914525 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-kubelet\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914542 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-systemd-units\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914557 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-etc-kubernetes\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914572 4742 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-openvswitch\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914589 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-ovn\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914603 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-var-lib-cni-multus\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914617 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-multus-daemon-config\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914631 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a0932050-dced-4c05-b9d2-d8db1db0dceb-cni-binary-copy\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914656 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-var-lib-openvswitch\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914670 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-cni-bin\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914702 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e11ad39-38bb-4b70-9cac-ce078b37f882-proxy-tls\") pod \"machine-config-daemon-5jxxw\" (UID: \"5e11ad39-38bb-4b70-9cac-ce078b37f882\") " pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914720 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-log-socket\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914740 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-var-lib-cni-bin\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914755 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2qj9\" (UniqueName: \"kubernetes.io/projected/a0932050-dced-4c05-b9d2-d8db1db0dceb-kube-api-access-r2qj9\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914781 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a0932050-dced-4c05-b9d2-d8db1db0dceb-cnibin\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914795 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-node-log\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914812 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w98f\" (UniqueName: \"kubernetes.io/projected/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-kube-api-access-4w98f\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914831 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmpzl\" (UniqueName: \"kubernetes.io/projected/5e11ad39-38bb-4b70-9cac-ce078b37f882-kube-api-access-vmpzl\") pod \"machine-config-daemon-5jxxw\" (UID: \"5e11ad39-38bb-4b70-9cac-ce078b37f882\") " pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914846 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-cni-binary-copy\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914860 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-run-multus-certs\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914884 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-slash\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914925 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-etc-openvswitch\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914943 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovn-node-metrics-cert\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914959 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-run-k8s-cni-cncf-io\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914981 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-os-release\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.914999 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-env-overrides\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915014 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5e11ad39-38bb-4b70-9cac-ce078b37f882-mcd-auth-proxy-config\") pod \"machine-config-daemon-5jxxw\" (UID: \"5e11ad39-38bb-4b70-9cac-ce078b37f882\") " pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915030 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-cnibin\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915051 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-multus-socket-dir-parent\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915057 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a0932050-dced-4c05-b9d2-d8db1db0dceb-cnibin\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915068 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-openvswitch\") pod \"ovnkube-node-zwfsr\" (UID: 
\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915091 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fa31fa5e-119d-4392-b5c6-8f4a488e64af-hosts-file\") pod \"node-resolver-kwrj5\" (UID: \"fa31fa5e-119d-4392-b5c6-8f4a488e64af\") " pod="openshift-dns/node-resolver-kwrj5" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915107 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-node-log\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915070 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-var-lib-kubelet\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915141 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-var-lib-cni-multus\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915147 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915174 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-run-netns\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915188 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-systemd\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915211 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-hostroot\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915212 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5e11ad39-38bb-4b70-9cac-ce078b37f882-rootfs\") pod \"machine-config-daemon-5jxxw\" (UID: \"5e11ad39-38bb-4b70-9cac-ce078b37f882\") " pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915235 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/5e11ad39-38bb-4b70-9cac-ce078b37f882-rootfs\") pod \"machine-config-daemon-5jxxw\" (UID: \"5e11ad39-38bb-4b70-9cac-ce078b37f882\") " pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915246 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915270 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-run-netns\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915290 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-cni-netd\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915314 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915395 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915399 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-system-cni-dir\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915411 4742 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915427 4742 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915440 4742 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915452 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-multus-cni-dir\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915478 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915508 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-run-ovn-kubernetes\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915520 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a0932050-dced-4c05-b9d2-d8db1db0dceb-os-release\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915116 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-ovn\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915091 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-var-lib-kubelet\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915452 4742 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.915957 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-multus-daemon-config\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.916026 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.916060 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-run-k8s-cni-cncf-io\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: 
I0317 11:13:17.916522 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-slash\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.916575 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-run-multus-certs\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.916637 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-cni-binary-copy\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.916665 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-os-release\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.916687 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-multus-conf-dir\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.916672 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-var-lib-openvswitch\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.916759 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-cni-netd\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.916800 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.916821 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a0932050-dced-4c05-b9d2-d8db1db0dceb-system-cni-dir\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.916841 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-run-netns\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.916866 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-kubelet\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.917153 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-env-overrides\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.917188 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-cni-bin\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.917204 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-log-socket\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.917224 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-systemd\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.917244 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-etc-kubernetes\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.917274 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-etc-openvswitch\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.917303 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-cnibin\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.917641 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovnkube-config\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: 
I0317 11:13:17.917745 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-host-var-lib-cni-bin\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.917840 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a0932050-dced-4c05-b9d2-d8db1db0dceb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.917880 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5e11ad39-38bb-4b70-9cac-ce078b37f882-mcd-auth-proxy-config\") pod \"machine-config-daemon-5jxxw\" (UID: \"5e11ad39-38bb-4b70-9cac-ce078b37f882\") " pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.917927 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-multus-socket-dir-parent\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918008 4742 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918021 4742 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918032 4742 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918042 4742 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918053 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918062 4742 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918077 4742 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918087 4742 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918085 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a0932050-dced-4c05-b9d2-d8db1db0dceb-cni-binary-copy\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918097 4742 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918206 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-systemd-units\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918263 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918276 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918288 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918299 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918309 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918319 4742 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918331 4742 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918341 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918350 4742 reconciler_common.go:293] "Volume detached for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918361 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918371 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918380 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918389 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918399 4742 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918407 4742 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918417 4742 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918426 4742 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918435 4742 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918444 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918453 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918462 4742 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918470 4742 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918488 4742 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918526 4742 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918535 4742 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918552 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918560 4742 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918570 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918578 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918589 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918597 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918605 4742 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918615 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918623 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918633 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918642 4742 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918654 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918664 4742 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918673 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918682 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918691 4742 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918699 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918708 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918718 4742 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918726 4742 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918736 4742 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918751 4742 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918760 4742 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918768 4742 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918777 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918789 4742 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918797 4742 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918804 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918813 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918822 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.918949 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovnkube-script-lib\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.919528 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a0932050-dced-4c05-b9d2-d8db1db0dceb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.921013 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e11ad39-38bb-4b70-9cac-ce078b37f882-proxy-tls\") pod \"machine-config-daemon-5jxxw\" (UID: \"5e11ad39-38bb-4b70-9cac-ce078b37f882\") " pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.927662 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovn-node-metrics-cert\") pod \"ovnkube-node-zwfsr\" (UID: 
\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.932374 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmpzl\" (UniqueName: \"kubernetes.io/projected/5e11ad39-38bb-4b70-9cac-ce078b37f882-kube-api-access-vmpzl\") pod \"machine-config-daemon-5jxxw\" (UID: \"5e11ad39-38bb-4b70-9cac-ce078b37f882\") " pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.933860 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w4nh\" (UniqueName: \"kubernetes.io/projected/fa31fa5e-119d-4392-b5c6-8f4a488e64af-kube-api-access-5w4nh\") pod \"node-resolver-kwrj5\" (UID: \"fa31fa5e-119d-4392-b5c6-8f4a488e64af\") " pod="openshift-dns/node-resolver-kwrj5" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.934494 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w98f\" (UniqueName: \"kubernetes.io/projected/ff1068ee-5ebe-4575-806d-967a3b9bfb6a-kube-api-access-4w98f\") pod \"multus-xwmfc\" (UID: \"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\") " pod="openshift-multus/multus-xwmfc" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.935180 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkjp8\" (UniqueName: \"kubernetes.io/projected/d021cdee-f700-4a5f-a62e-be4acbb8c62e-kube-api-access-qkjp8\") pod \"ovnkube-node-zwfsr\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.935863 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2qj9\" (UniqueName: \"kubernetes.io/projected/a0932050-dced-4c05-b9d2-d8db1db0dceb-kube-api-access-r2qj9\") pod \"multus-additional-cni-plugins-hcxv8\" (UID: \"a0932050-dced-4c05-b9d2-d8db1db0dceb\") " pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.954618 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.954686 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.954701 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.954736 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.954756 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:17Z","lastTransitionTime":"2026-03-17T11:13:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:17 crc kubenswrapper[4742]: I0317 11:13:17.981564 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 17 11:13:17 crc kubenswrapper[4742]: W0317 11:13:17.994721 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-a8f39994eb2bdaed63527e569d02df4edbddbde10b92fa4cbef8273639fd21db WatchSource:0}: Error finding container a8f39994eb2bdaed63527e569d02df4edbddbde10b92fa4cbef8273639fd21db: Status 404 returned error can't find the container with id a8f39994eb2bdaed63527e569d02df4edbddbde10b92fa4cbef8273639fd21db Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.997704 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:17 crc kubenswrapper[4742]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 17 11:13:17 crc kubenswrapper[4742]: set -o allexport Mar 17 11:13:17 crc kubenswrapper[4742]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 17 11:13:17 crc kubenswrapper[4742]: source /etc/kubernetes/apiserver-url.env Mar 17 11:13:17 crc kubenswrapper[4742]: else Mar 17 11:13:17 crc kubenswrapper[4742]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 17 11:13:17 crc kubenswrapper[4742]: exit 1 Mar 17 11:13:17 crc kubenswrapper[4742]: fi Mar 17 11:13:17 crc kubenswrapper[4742]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 17 11:13:17 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvV
ar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:17 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:17 crc kubenswrapper[4742]: E0317 11:13:17.998950 4742 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.002299 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 17 11:13:18 crc kubenswrapper[4742]: W0317 11:13:18.015314 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-499e28956943b0c312809d95393748587a10460e327b3f92cc6bbb72b26a1d35 WatchSource:0}: Error finding container 499e28956943b0c312809d95393748587a10460e327b3f92cc6bbb72b26a1d35: Status 404 returned error can't find the container with id 499e28956943b0c312809d95393748587a10460e327b3f92cc6bbb72b26a1d35 Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.017788 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:18 crc kubenswrapper[4742]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 17 11:13:18 crc kubenswrapper[4742]: if [[ -f "/env/_master" ]]; then Mar 17 11:13:18 crc kubenswrapper[4742]: set -o allexport Mar 17 11:13:18 crc kubenswrapper[4742]: source "/env/_master" Mar 17 11:13:18 crc kubenswrapper[4742]: set +o allexport Mar 17 11:13:18 crc kubenswrapper[4742]: fi Mar 17 11:13:18 crc kubenswrapper[4742]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 17 11:13:18 crc kubenswrapper[4742]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 17 11:13:18 crc kubenswrapper[4742]: ho_enable="--enable-hybrid-overlay" Mar 17 11:13:18 crc kubenswrapper[4742]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 17 11:13:18 crc kubenswrapper[4742]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 17 11:13:18 crc kubenswrapper[4742]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 17 11:13:18 crc kubenswrapper[4742]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 17 11:13:18 crc kubenswrapper[4742]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 17 11:13:18 crc kubenswrapper[4742]: --webhook-host=127.0.0.1 \ Mar 17 11:13:18 crc kubenswrapper[4742]: --webhook-port=9743 \ Mar 17 11:13:18 crc kubenswrapper[4742]: ${ho_enable} \ Mar 17 11:13:18 crc kubenswrapper[4742]: --enable-interconnect \ Mar 17 11:13:18 crc kubenswrapper[4742]: --disable-approver \ Mar 17 11:13:18 crc kubenswrapper[4742]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 17 11:13:18 crc kubenswrapper[4742]: --wait-for-kubernetes-api=200s \ Mar 17 11:13:18 crc kubenswrapper[4742]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 17 11:13:18 crc kubenswrapper[4742]: --loglevel="${LOGLEVEL}" Mar 17 11:13:18 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct 
envvars Mar 17 11:13:18 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.017882 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.023256 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:18 crc kubenswrapper[4742]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 17 11:13:18 crc kubenswrapper[4742]: if [[ -f "/env/_master" ]]; then Mar 17 11:13:18 crc kubenswrapper[4742]: set -o allexport Mar 17 11:13:18 crc kubenswrapper[4742]: source "/env/_master" Mar 17 11:13:18 crc kubenswrapper[4742]: set +o allexport Mar 17 11:13:18 crc kubenswrapper[4742]: fi Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 17 11:13:18 crc kubenswrapper[4742]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 17 11:13:18 crc kubenswrapper[4742]: --disable-webhook \ Mar 17 11:13:18 crc kubenswrapper[4742]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 17 11:13:18 crc kubenswrapper[4742]: --loglevel="${LOGLEVEL}" Mar 17 11:13:18 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:18 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.024623 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with 
CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.025747 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kwrj5" Mar 17 11:13:18 crc kubenswrapper[4742]: W0317 11:13:18.031232 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-f5c48a451e84b50935fff41df89b0c91da54bd20fc9b1b15f626985e3bb73200 WatchSource:0}: Error finding container f5c48a451e84b50935fff41df89b0c91da54bd20fc9b1b15f626985e3bb73200: Status 404 returned error can't find the container with id f5c48a451e84b50935fff41df89b0c91da54bd20fc9b1b15f626985e3bb73200 Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.039073 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.040476 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 17 
11:13:18 crc kubenswrapper[4742]: W0317 11:13:18.043096 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa31fa5e_119d_4392_b5c6_8f4a488e64af.slice/crio-d4085447469cff6773671984e85dfb5f3a4bab64150151e1901488f4e61bdac1 WatchSource:0}: Error finding container d4085447469cff6773671984e85dfb5f3a4bab64150151e1901488f4e61bdac1: Status 404 returned error can't find the container with id d4085447469cff6773671984e85dfb5f3a4bab64150151e1901488f4e61bdac1 Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.043151 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.045505 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:18 crc kubenswrapper[4742]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 17 11:13:18 crc kubenswrapper[4742]: set -uo pipefail Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 17 11:13:18 crc kubenswrapper[4742]: HOSTS_FILE="/etc/hosts" Mar 17 11:13:18 crc kubenswrapper[4742]: TEMP_FILE="/etc/hosts.tmp" Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: # Make a temporary file with the old hosts file's attributes. Mar 17 11:13:18 crc kubenswrapper[4742]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 17 11:13:18 crc kubenswrapper[4742]: echo "Failed to preserve hosts file. Exiting." Mar 17 11:13:18 crc kubenswrapper[4742]: exit 1 Mar 17 11:13:18 crc kubenswrapper[4742]: fi Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: while true; do Mar 17 11:13:18 crc kubenswrapper[4742]: declare -A svc_ips Mar 17 11:13:18 crc kubenswrapper[4742]: for svc in "${services[@]}"; do Mar 17 11:13:18 crc kubenswrapper[4742]: # Fetch service IP from cluster dns if present. We make several tries Mar 17 11:13:18 crc kubenswrapper[4742]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 17 11:13:18 crc kubenswrapper[4742]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 17 11:13:18 crc kubenswrapper[4742]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 17 11:13:18 crc kubenswrapper[4742]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 11:13:18 crc kubenswrapper[4742]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 11:13:18 crc kubenswrapper[4742]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 11:13:18 crc kubenswrapper[4742]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 17 11:13:18 crc kubenswrapper[4742]: for i in ${!cmds[*]} Mar 17 11:13:18 crc kubenswrapper[4742]: do Mar 17 11:13:18 crc kubenswrapper[4742]: ips=($(eval "${cmds[i]}")) Mar 17 11:13:18 crc kubenswrapper[4742]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 17 11:13:18 crc kubenswrapper[4742]: svc_ips["${svc}"]="${ips[@]}" Mar 17 11:13:18 crc kubenswrapper[4742]: break Mar 17 11:13:18 crc kubenswrapper[4742]: fi Mar 17 11:13:18 crc kubenswrapper[4742]: done Mar 17 11:13:18 crc kubenswrapper[4742]: done Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: # Update /etc/hosts only if we get valid service IPs Mar 17 11:13:18 crc kubenswrapper[4742]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 17 11:13:18 crc kubenswrapper[4742]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 17 11:13:18 crc kubenswrapper[4742]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 17 11:13:18 crc kubenswrapper[4742]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 17 11:13:18 crc kubenswrapper[4742]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 17 11:13:18 crc kubenswrapper[4742]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 17 11:13:18 crc kubenswrapper[4742]: sleep 60 & wait Mar 17 11:13:18 crc kubenswrapper[4742]: continue Mar 17 11:13:18 crc kubenswrapper[4742]: fi Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: # Append resolver entries for services Mar 17 11:13:18 crc kubenswrapper[4742]: rc=0 Mar 17 11:13:18 crc kubenswrapper[4742]: for svc in "${!svc_ips[@]}"; do Mar 17 11:13:18 crc kubenswrapper[4742]: for ip in ${svc_ips[${svc}]}; do Mar 17 11:13:18 crc kubenswrapper[4742]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 17 11:13:18 crc kubenswrapper[4742]: done Mar 17 11:13:18 crc kubenswrapper[4742]: done Mar 17 11:13:18 crc kubenswrapper[4742]: if [[ $rc -ne 0 ]]; then Mar 17 11:13:18 crc kubenswrapper[4742]: sleep 60 & wait Mar 17 11:13:18 crc kubenswrapper[4742]: continue Mar 17 11:13:18 crc kubenswrapper[4742]: fi Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 17 11:13:18 crc kubenswrapper[4742]: # Replace /etc/hosts with our modified version if needed Mar 17 11:13:18 crc kubenswrapper[4742]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 17 11:13:18 crc kubenswrapper[4742]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 17 11:13:18 crc kubenswrapper[4742]: fi Mar 17 11:13:18 crc kubenswrapper[4742]: sleep 60 & wait Mar 17 11:13:18 crc kubenswrapper[4742]: unset svc_ips Mar 17 11:13:18 crc kubenswrapper[4742]: done Mar 17 11:13:18 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5w4nh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-kwrj5_openshift-dns(fa31fa5e-119d-4392-b5c6-8f4a488e64af): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:18 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.046783 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-kwrj5" podUID="fa31fa5e-119d-4392-b5c6-8f4a488e64af" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.050571 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.058284 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.058327 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.058342 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.058363 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.058375 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:18Z","lastTransitionTime":"2026-03-17T11:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.058965 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xwmfc" Mar 17 11:13:18 crc kubenswrapper[4742]: W0317 11:13:18.059957 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e11ad39_38bb_4b70_9cac_ce078b37f882.slice/crio-58783a871a3096f01db272dd7b2270f1fa51dd965a48b1de91fb428dd3176089 WatchSource:0}: Error finding container 58783a871a3096f01db272dd7b2270f1fa51dd965a48b1de91fb428dd3176089: Status 404 returned error can't find the container with id 58783a871a3096f01db272dd7b2270f1fa51dd965a48b1de91fb428dd3176089 Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.064446 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.064992 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vmpzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.067766 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r2qj9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-hcxv8_openshift-multus(a0932050-dced-4c05-b9d2-d8db1db0dceb): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.067813 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vmpzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.069204 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" podUID="a0932050-dced-4c05-b9d2-d8db1db0dceb" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.069352 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.074953 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:18 crc kubenswrapper[4742]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 17 11:13:18 crc kubenswrapper[4742]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 17 11:13:18 crc kubenswrapper[4742]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4w98f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,Re
cursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-xwmfc_openshift-multus(ff1068ee-5ebe-4575-806d-967a3b9bfb6a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:18 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.076156 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-xwmfc" podUID="ff1068ee-5ebe-4575-806d-967a3b9bfb6a" Mar 17 11:13:18 crc kubenswrapper[4742]: W0317 11:13:18.081135 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd021cdee_f700_4a5f_a62e_be4acbb8c62e.slice/crio-61da21cdaeb0ecf937ece364594b3e839720124913fb881b02094fdd7fe63e87 WatchSource:0}: Error finding container 61da21cdaeb0ecf937ece364594b3e839720124913fb881b02094fdd7fe63e87: Status 404 returned error can't find the container with id 61da21cdaeb0ecf937ece364594b3e839720124913fb881b02094fdd7fe63e87 Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.084102 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:18 crc kubenswrapper[4742]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 17 11:13:18 crc kubenswrapper[4742]: apiVersion: v1 Mar 17 11:13:18 crc kubenswrapper[4742]: clusters: Mar 17 11:13:18 crc kubenswrapper[4742]: - cluster: Mar 17 11:13:18 crc kubenswrapper[4742]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 17 11:13:18 crc kubenswrapper[4742]: server: https://api-int.crc.testing:6443 Mar 17 11:13:18 crc kubenswrapper[4742]: name: default-cluster Mar 17 11:13:18 crc kubenswrapper[4742]: contexts: Mar 17 11:13:18 crc kubenswrapper[4742]: - context: Mar 17 11:13:18 crc kubenswrapper[4742]: cluster: default-cluster Mar 17 11:13:18 crc kubenswrapper[4742]: namespace: default Mar 17 11:13:18 crc kubenswrapper[4742]: user: default-auth Mar 17 11:13:18 crc kubenswrapper[4742]: name: default-context Mar 17 11:13:18 crc kubenswrapper[4742]: current-context: default-context Mar 17 11:13:18 crc kubenswrapper[4742]: kind: Config Mar 17 11:13:18 crc kubenswrapper[4742]: preferences: {} Mar 17 11:13:18 crc kubenswrapper[4742]: users: Mar 17 11:13:18 crc kubenswrapper[4742]: - name: default-auth Mar 17 11:13:18 crc kubenswrapper[4742]: user: Mar 17 11:13:18 crc kubenswrapper[4742]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 17 11:13:18 crc kubenswrapper[4742]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 17 11:13:18 crc kubenswrapper[4742]: 
EOF Mar 17 11:13:18 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qkjp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:18 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.085944 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.161635 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.161698 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.161719 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.161747 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.161767 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:18Z","lastTransitionTime":"2026-03-17T11:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.163919 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwmfc" event={"ID":"ff1068ee-5ebe-4575-806d-967a3b9bfb6a","Type":"ContainerStarted","Data":"f0e0a82c8473a823899c82fae9f3a888dd27b1b6e209eb900b4e3168dd63d414"} Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.165662 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"58783a871a3096f01db272dd7b2270f1fa51dd965a48b1de91fb428dd3176089"} Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.166236 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:18 crc kubenswrapper[4742]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 17 11:13:18 crc kubenswrapper[4742]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 17 11:13:18 crc kubenswrapper[4742]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4w98f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-xwmfc_openshift-multus(ff1068ee-5ebe-4575-806d-967a3b9bfb6a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:18 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.167087 4742 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" event={"ID":"a0932050-dced-4c05-b9d2-d8db1db0dceb","Type":"ContainerStarted","Data":"1ad8a3ac7a84cf01e6ea6618f2e5aa36047a4002f5ec0df3138332cff696ae82"} Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.167407 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-xwmfc" podUID="ff1068ee-5ebe-4575-806d-967a3b9bfb6a" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.168713 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r2qj9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-hcxv8_openshift-multus(a0932050-dced-4c05-b9d2-d8db1db0dceb): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.168868 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kwrj5" event={"ID":"fa31fa5e-119d-4392-b5c6-8f4a488e64af","Type":"ContainerStarted","Data":"d4085447469cff6773671984e85dfb5f3a4bab64150151e1901488f4e61bdac1"} Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.169046 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start 
--payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vmpzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.169951 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" podUID="a0932050-dced-4c05-b9d2-d8db1db0dceb" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.170422 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"499e28956943b0c312809d95393748587a10460e327b3f92cc6bbb72b26a1d35"} Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.170668 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:18 crc kubenswrapper[4742]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 17 11:13:18 crc kubenswrapper[4742]: set -uo pipefail Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 17 11:13:18 crc kubenswrapper[4742]: HOSTS_FILE="/etc/hosts" Mar 17 11:13:18 crc 
kubenswrapper[4742]: TEMP_FILE="/etc/hosts.tmp" Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: # Make a temporary file with the old hosts file's attributes. Mar 17 11:13:18 crc kubenswrapper[4742]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 17 11:13:18 crc kubenswrapper[4742]: echo "Failed to preserve hosts file. Exiting." Mar 17 11:13:18 crc kubenswrapper[4742]: exit 1 Mar 17 11:13:18 crc kubenswrapper[4742]: fi Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: while true; do Mar 17 11:13:18 crc kubenswrapper[4742]: declare -A svc_ips Mar 17 11:13:18 crc kubenswrapper[4742]: for svc in "${services[@]}"; do Mar 17 11:13:18 crc kubenswrapper[4742]: # Fetch service IP from cluster dns if present. We make several tries Mar 17 11:13:18 crc kubenswrapper[4742]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 17 11:13:18 crc kubenswrapper[4742]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 17 11:13:18 crc kubenswrapper[4742]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 17 11:13:18 crc kubenswrapper[4742]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 11:13:18 crc kubenswrapper[4742]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 11:13:18 crc kubenswrapper[4742]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 11:13:18 crc kubenswrapper[4742]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 17 11:13:18 crc kubenswrapper[4742]: for i in ${!cmds[*]} Mar 17 11:13:18 crc kubenswrapper[4742]: do Mar 17 11:13:18 crc kubenswrapper[4742]: ips=($(eval "${cmds[i]}")) Mar 17 11:13:18 crc kubenswrapper[4742]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 17 11:13:18 crc kubenswrapper[4742]: svc_ips["${svc}"]="${ips[@]}" Mar 17 11:13:18 crc kubenswrapper[4742]: break Mar 17 11:13:18 crc kubenswrapper[4742]: fi Mar 17 11:13:18 crc kubenswrapper[4742]: done Mar 17 11:13:18 crc kubenswrapper[4742]: done Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: # Update /etc/hosts only if we get valid service IPs Mar 17 11:13:18 crc kubenswrapper[4742]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 17 11:13:18 crc kubenswrapper[4742]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 17 11:13:18 crc kubenswrapper[4742]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 17 11:13:18 crc kubenswrapper[4742]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 17 11:13:18 crc kubenswrapper[4742]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 17 11:13:18 crc kubenswrapper[4742]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 17 11:13:18 crc kubenswrapper[4742]: sleep 60 & wait Mar 17 11:13:18 crc kubenswrapper[4742]: continue Mar 17 11:13:18 crc kubenswrapper[4742]: fi Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: # Append resolver entries for services Mar 17 11:13:18 crc kubenswrapper[4742]: rc=0 Mar 17 11:13:18 crc kubenswrapper[4742]: for svc in "${!svc_ips[@]}"; do Mar 17 11:13:18 crc kubenswrapper[4742]: for ip in ${svc_ips[${svc}]}; do Mar 17 11:13:18 crc kubenswrapper[4742]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 17 11:13:18 crc kubenswrapper[4742]: done Mar 17 11:13:18 crc kubenswrapper[4742]: done Mar 17 11:13:18 crc kubenswrapper[4742]: if [[ $rc -ne 0 ]]; then Mar 17 11:13:18 crc kubenswrapper[4742]: sleep 60 & wait Mar 17 11:13:18 crc kubenswrapper[4742]: continue Mar 17 11:13:18 crc kubenswrapper[4742]: fi Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 17 11:13:18 crc kubenswrapper[4742]: # Replace /etc/hosts with our modified version if needed Mar 17 11:13:18 crc kubenswrapper[4742]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 17 11:13:18 crc kubenswrapper[4742]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 17 11:13:18 crc kubenswrapper[4742]: fi Mar 17 11:13:18 crc kubenswrapper[4742]: sleep 60 & wait Mar 17 11:13:18 crc kubenswrapper[4742]: unset svc_ips Mar 17 11:13:18 crc kubenswrapper[4742]: done Mar 17 11:13:18 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5w4nh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-kwrj5_openshift-dns(fa31fa5e-119d-4392-b5c6-8f4a488e64af): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:18 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:18 crc 
kubenswrapper[4742]: E0317 11:13:18.171249 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vmpzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.172261 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerStarted","Data":"61da21cdaeb0ecf937ece364594b3e839720124913fb881b02094fdd7fe63e87"} Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.172347 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:18 crc kubenswrapper[4742]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 17 11:13:18 crc kubenswrapper[4742]: if [[ -f "/env/_master" ]]; then Mar 17 11:13:18 crc kubenswrapper[4742]: set -o allexport Mar 17 11:13:18 crc kubenswrapper[4742]: source "/env/_master" Mar 17 11:13:18 crc kubenswrapper[4742]: set +o allexport Mar 17 11:13:18 crc kubenswrapper[4742]: fi Mar 17 11:13:18 crc kubenswrapper[4742]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 17 11:13:18 crc kubenswrapper[4742]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 17 11:13:18 crc kubenswrapper[4742]: ho_enable="--enable-hybrid-overlay" Mar 17 11:13:18 crc kubenswrapper[4742]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 17 11:13:18 crc kubenswrapper[4742]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 17 11:13:18 crc kubenswrapper[4742]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 17 11:13:18 crc kubenswrapper[4742]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 17 11:13:18 crc kubenswrapper[4742]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 17 11:13:18 crc kubenswrapper[4742]: --webhook-host=127.0.0.1 \ Mar 17 11:13:18 crc kubenswrapper[4742]: --webhook-port=9743 \ Mar 17 11:13:18 crc kubenswrapper[4742]: ${ho_enable} \ Mar 17 11:13:18 crc kubenswrapper[4742]: --enable-interconnect \ Mar 17 11:13:18 crc kubenswrapper[4742]: --disable-approver \ Mar 17 11:13:18 crc kubenswrapper[4742]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 17 11:13:18 crc kubenswrapper[4742]: --wait-for-kubernetes-api=200s \ Mar 17 11:13:18 crc kubenswrapper[4742]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 17 11:13:18 crc kubenswrapper[4742]: --loglevel="${LOGLEVEL}" Mar 17 11:13:18 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct 
envvars Mar 17 11:13:18 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.172427 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.171799 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-kwrj5" podUID="fa31fa5e-119d-4392-b5c6-8f4a488e64af" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.179219 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:18 crc kubenswrapper[4742]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 17 11:13:18 crc kubenswrapper[4742]: if [[ -f "/env/_master" ]]; then Mar 17 11:13:18 crc kubenswrapper[4742]: set -o allexport Mar 17 11:13:18 crc kubenswrapper[4742]: source "/env/_master" Mar 17 11:13:18 crc kubenswrapper[4742]: set +o allexport Mar 17 11:13:18 crc kubenswrapper[4742]: fi Mar 17 11:13:18 crc kubenswrapper[4742]: Mar 17 11:13:18 crc kubenswrapper[4742]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 17 11:13:18 crc kubenswrapper[4742]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 17 11:13:18 crc kubenswrapper[4742]: --disable-webhook \ Mar 17 11:13:18 crc kubenswrapper[4742]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 17 11:13:18 crc kubenswrapper[4742]: --loglevel="${LOGLEVEL}" Mar 17 11:13:18 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:18 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.179304 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a8f39994eb2bdaed63527e569d02df4edbddbde10b92fa4cbef8273639fd21db"} Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.180037 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:18 crc kubenswrapper[4742]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 17 11:13:18 crc kubenswrapper[4742]: apiVersion: v1 Mar 17 11:13:18 crc kubenswrapper[4742]: clusters: Mar 17 11:13:18 crc kubenswrapper[4742]: - cluster: Mar 17 11:13:18 crc kubenswrapper[4742]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 17 11:13:18 crc kubenswrapper[4742]: server: https://api-int.crc.testing:6443 Mar 17 11:13:18 crc kubenswrapper[4742]: name: default-cluster Mar 17 11:13:18 crc kubenswrapper[4742]: contexts: Mar 17 11:13:18 crc kubenswrapper[4742]: - context: Mar 17 11:13:18 crc kubenswrapper[4742]: cluster: default-cluster Mar 17 11:13:18 crc kubenswrapper[4742]: namespace: default Mar 17 11:13:18 crc kubenswrapper[4742]: user: default-auth Mar 17 11:13:18 crc kubenswrapper[4742]: name: default-context Mar 17 11:13:18 crc kubenswrapper[4742]: current-context: default-context Mar 17 11:13:18 crc kubenswrapper[4742]: kind: Config Mar 17 11:13:18 crc kubenswrapper[4742]: preferences: {} Mar 17 11:13:18 crc kubenswrapper[4742]: users: Mar 17 11:13:18 crc kubenswrapper[4742]: - name: default-auth Mar 17 11:13:18 crc kubenswrapper[4742]: user: Mar 17 11:13:18 crc kubenswrapper[4742]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 17 11:13:18 crc kubenswrapper[4742]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 17 11:13:18 crc 
kubenswrapper[4742]: EOF Mar 17 11:13:18 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qkjp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:18 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.180422 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.181415 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.182198 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.182609 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:18 crc kubenswrapper[4742]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 17 11:13:18 crc kubenswrapper[4742]: set -o allexport Mar 17 11:13:18 crc kubenswrapper[4742]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 17 11:13:18 crc kubenswrapper[4742]: source /etc/kubernetes/apiserver-url.env Mar 17 11:13:18 crc kubenswrapper[4742]: else Mar 17 11:13:18 crc kubenswrapper[4742]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 17 11:13:18 crc kubenswrapper[4742]: exit 1 Mar 17 11:13:18 crc kubenswrapper[4742]: fi Mar 17 11:13:18 crc kubenswrapper[4742]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 17 11:13:18 crc kubenswrapper[4742]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:18 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.185080 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.185792 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.185865 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7"} Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.186488 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.187351 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f5c48a451e84b50935fff41df89b0c91da54bd20fc9b1b15f626985e3bb73200"} Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.189341 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.190589 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.201463 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
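
The two CreateContainerConfigError failures above (network-operator and iptables-alerter) share one message: "services have not yet been read at least once, cannot construct envvars". Kubelet builds the legacy service environment variables (KUBERNETES_SERVICE_HOST and friends) for every container from its Service informer cache, and after a restart it refuses to construct any envvars until that cache has completed its first sync; container creation fails and the pod worker retries, which is exactly the "Error syncing pod, skipping" pattern in these entries. A minimal Go sketch of that gate — the identifiers and the sample value are illustrative, not the kubelet's actual code:

```go
// Hedged sketch of the sync gate behind
// "services have not yet been read at least once, cannot construct envvars".
package main

import (
	"errors"
	"fmt"
)

type envVarBuilder struct {
	// flips to true once the kubelet's Service informer has completed
	// its first List against the API server after startup
	serviceHasSynced func() bool
}

func (b *envVarBuilder) buildServiceEnv() (map[string]string, error) {
	if !b.serviceHasSynced() {
		// Surfaces to the pod worker as CreateContainerConfigError
		// and is retried on the next pod sync.
		return nil, errors.New("services have not yet been read at least once, cannot construct envvars")
	}
	// Once synced, service links would be derived from cached Services
	// here; the address below is purely illustrative.
	return map[string]string{"KUBERNETES_SERVICE_HOST": "10.217.4.1"}, nil
}

func main() {
	b := &envVarBuilder{serviceHasSynced: func() bool { return false }}
	if _, err := b.buildServiceEnv(); err != nil {
		fmt.Println("CreateContainerConfigError:", err)
	}
}
```
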
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.232192 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":
\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.241376 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.251251 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.266167 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.267865 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.267922 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.267931 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.267948 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.267959 4742 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:18Z","lastTransitionTime":"2026-03-17T11:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.281816 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apis
erver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.301503 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 
17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.315830 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
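
Every "Failed to update status for pod" entry in this stretch fails the same way: before admitting the status patch, the API server must call the mutating webhook pod.network-node-identity.openshift.io at https://127.0.0.1:9743, and nothing is listening there. That is consistent with the entry above — the network-node-identity-vrzqb pod that serves this webhook is itself still in ContainerCreating, so every pod status patch stays blocked until it comes up. A trivial, purely illustrative probe of the same endpoint:

```go
// Minimal connectivity probe for the endpoint the API server cannot reach
// (https://127.0.0.1:9743/pod). "connection refused" just means nothing is
// listening yet, matching the webhook pod still being in ContainerCreating.
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	conn, err := net.DialTimeout("tcp", "127.0.0.1:9743", 2*time.Second)
	if err != nil {
		fmt.Println("webhook endpoint unreachable:", err) // e.g. connect: connection refused
		return
	}
	conn.Close()
	fmt.Println("webhook endpoint is accepting connections")
}
```
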
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.323213 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.323380 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.323454 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:13:19.323414983 +0000 UTC m=+102.449542871 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.323508 4742 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.323567 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:19.323552386 +0000 UTC m=+102.449680144 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.323830 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.323858 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.323988 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.323962 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.324062 4742 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.324066 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.324110 4742 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.324148 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.324218 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.324247 4742 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.324113 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:19.324105421 +0000 UTC m=+102.450233179 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.324291 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:19.324282187 +0000 UTC m=+102.450409945 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.324347 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:19.324307688 +0000 UTC m=+102.450435566 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.326381 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
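
The MountVolume.SetUp failures for nginx-conf, networking-console-plugin-cert, kube-api-access-cqllr and kube-api-access-s2dwl all report `object ... not registered`. Kubelet's watch-based ConfigMap/Secret manager serves only objects that some admitted pod has registered a reference to; immediately after a restart, the volume reconciler can race that registration, so the mount fails and is retried a second later, as the durationBeforeRetry 1s entries show. An illustrative sketch of that cache check (names assumed):

```go
// Hedged model of the reference-gated object cache behind errors like
// object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered.
package main

import "fmt"

type objectCache struct {
	registered map[string]bool // "namespace/name" -> an admitted pod references it
}

func (c *objectCache) getConfigMap(namespace, name string) (string, error) {
	if !c.registered[namespace+"/"+name] {
		// Mounts that race pod registration after a kubelet restart
		// land here and are retried shortly afterwards.
		return "", fmt.Errorf("object %q/%q not registered", namespace, name)
	}
	return "<configmap data>", nil
}

func main() {
	c := &objectCache{registered: map[string]bool{}}
	_, err := c.getConfigMap("openshift-network-diagnostics", "kube-root-ca.crt")
	fmt.Println(err)
}
```
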
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.344640 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.358376 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.370966 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.371008 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.371020 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.371040 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.371068 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:18Z","lastTransitionTime":"2026-03-17T11:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.372273 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.388188 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.403247 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.414457 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.434100 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.450932 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.461218 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.473084 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.474398 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.474505 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.474529 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.475311 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.475396 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:18Z","lastTransitionTime":"2026-03-17T11:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.478455 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.478515 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.478531 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.478552 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.478564 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:18Z","lastTransitionTime":"2026-03-17T11:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.485047 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.492841 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6
693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.497206 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.497264 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.497283 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.497308 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.497331 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:18Z","lastTransitionTime":"2026-03-17T11:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.503389 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de
44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.511450 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.513936 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6
693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.517584 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.517631 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.517653 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.517702 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.517727 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:18Z","lastTransitionTime":"2026-03-17T11:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.520041 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.524160 4742 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.528316 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8805
1c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.530257 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.532171 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.532197 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.532209 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.532225 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.532237 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:18Z","lastTransitionTime":"2026-03-17T11:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.540866 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.543260 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.544543 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.544591 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.544604 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.544626 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.544642 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:18Z","lastTransitionTime":"2026-03-17T11:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.555211 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: E0317 11:13:18.555388 4742 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.578245 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.578310 4742 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.578324 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.578339 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.578349 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:18Z","lastTransitionTime":"2026-03-17T11:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.669408 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.670664 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.674219 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.675765 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.678053 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.679402 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.681403 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.681870 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.681899 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.681929 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.681947 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.681961 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:18Z","lastTransitionTime":"2026-03-17T11:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.684174 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.685766 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.688466 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.690272 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.691641 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.694581 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.696184 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.699320 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.699397 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.700745 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.704174 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.705392 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.706248 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.708848 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.710987 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.713889 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.714385 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.716363 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.717988 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.720599 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.721669 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.724353 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.731768 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.732465 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" 
Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.733949 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.734634 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.735827 4742 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.735996 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.738435 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.739140 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.740301 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.743019 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.744164 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.744584 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc 
kubenswrapper[4742]: I0317 11:13:18.747061 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.747948 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.756278 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.756954 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.758328 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.759579 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.760335 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.761379 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.763123 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.766215 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.767930 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.768684 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.769444 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.770609 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.771301 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.772711 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.773519 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.774738 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.780462 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.785130 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.785208 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.785223 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.785270 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.785285 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:18Z","lastTransitionTime":"2026-03-17T11:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.790830 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.801636 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 
11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc 
kubenswrapper[4742]: I0317 11:13:18.811211 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.826403 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.842737 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.856385 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.866849 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.887568 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.887607 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.887618 4742 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.887636 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.887647 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:18Z","lastTransitionTime":"2026-03-17T11:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.990973 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.991032 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.991056 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.991088 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:18 crc kubenswrapper[4742]: I0317 11:13:18.991112 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:18Z","lastTransitionTime":"2026-03-17T11:13:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.093539 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.093636 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.093657 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.093684 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.093706 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:19Z","lastTransitionTime":"2026-03-17T11:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.195839 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.195943 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.195963 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.195990 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.196009 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:19Z","lastTransitionTime":"2026-03-17T11:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.299378 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.299494 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.299514 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.299544 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.299563 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:19Z","lastTransitionTime":"2026-03-17T11:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.334457 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.334687 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.334778 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.334841 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:19 crc kubenswrapper[4742]: E0317 11:13:19.334941 4742 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.334974 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:19 crc kubenswrapper[4742]: E0317 11:13:19.335086 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:21.335055299 +0000 UTC m=+104.461183097 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 11:13:19 crc kubenswrapper[4742]: E0317 11:13:19.335156 4742 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 11:13:19 crc kubenswrapper[4742]: E0317 11:13:19.335263 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:21.335231965 +0000 UTC m=+104.461359933 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 11:13:19 crc kubenswrapper[4742]: E0317 11:13:19.335155 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 11:13:19 crc kubenswrapper[4742]: E0317 11:13:19.335299 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 11:13:19 crc kubenswrapper[4742]: E0317 11:13:19.335376 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 11:13:19 crc kubenswrapper[4742]: E0317 11:13:19.335391 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:13:21.335360538 +0000 UTC m=+104.461488486 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:13:19 crc kubenswrapper[4742]: E0317 11:13:19.335401 4742 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:19 crc kubenswrapper[4742]: E0317 11:13:19.335501 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-17 11:13:21.335481312 +0000 UTC m=+104.461609310 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:19 crc kubenswrapper[4742]: E0317 11:13:19.335321 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 11:13:19 crc kubenswrapper[4742]: E0317 11:13:19.335549 4742 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:19 crc kubenswrapper[4742]: E0317 11:13:19.335621 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:21.335602365 +0000 UTC m=+104.461730383 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.402615 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.402700 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.402719 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.402744 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.402762 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:19Z","lastTransitionTime":"2026-03-17T11:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.506589 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.506662 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.506687 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.506721 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.506747 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:19Z","lastTransitionTime":"2026-03-17T11:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.610528 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.610621 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.610645 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.610674 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.610692 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:19Z","lastTransitionTime":"2026-03-17T11:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.662307 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.662321 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:19 crc kubenswrapper[4742]: E0317 11:13:19.662511 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.662322 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:19 crc kubenswrapper[4742]: E0317 11:13:19.662657 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:19 crc kubenswrapper[4742]: E0317 11:13:19.662999 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.714246 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.714312 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.714328 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.714352 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.714366 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:19Z","lastTransitionTime":"2026-03-17T11:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.717825 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-hv2p6"] Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.718672 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-hv2p6" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.724693 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.724793 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.726083 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.726533 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.736539 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.738799 4742 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ad7af928-88e1-468c-9471-8e7902a4a6ee-serviceca\") pod \"node-ca-hv2p6\" (UID: \"ad7af928-88e1-468c-9471-8e7902a4a6ee\") " pod="openshift-image-registry/node-ca-hv2p6" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.738870 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92vc8\" (UniqueName: \"kubernetes.io/projected/ad7af928-88e1-468c-9471-8e7902a4a6ee-kube-api-access-92vc8\") pod \"node-ca-hv2p6\" (UID: \"ad7af928-88e1-468c-9471-8e7902a4a6ee\") " pod="openshift-image-registry/node-ca-hv2p6" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.738944 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad7af928-88e1-468c-9471-8e7902a4a6ee-host\") pod \"node-ca-hv2p6\" (UID: \"ad7af928-88e1-468c-9471-8e7902a4a6ee\") " pod="openshift-image-registry/node-ca-hv2p6" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.750436 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.761942 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.775302 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 
11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:19 crc 
kubenswrapper[4742]: I0317 11:13:19.793691 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.809796 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.817188 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.817294 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.817314 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.817342 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.817359 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:19Z","lastTransitionTime":"2026-03-17T11:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.824870 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.840339 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ad7af928-88e1-468c-9471-8e7902a4a6ee-serviceca\") pod \"node-ca-hv2p6\" (UID: \"ad7af928-88e1-468c-9471-8e7902a4a6ee\") " pod="openshift-image-registry/node-ca-hv2p6" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.840649 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92vc8\" (UniqueName: \"kubernetes.io/projected/ad7af928-88e1-468c-9471-8e7902a4a6ee-kube-api-access-92vc8\") pod \"node-ca-hv2p6\" (UID: \"ad7af928-88e1-468c-9471-8e7902a4a6ee\") " pod="openshift-image-registry/node-ca-hv2p6" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.840859 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad7af928-88e1-468c-9471-8e7902a4a6ee-host\") pod 
\"node-ca-hv2p6\" (UID: \"ad7af928-88e1-468c-9471-8e7902a4a6ee\") " pod="openshift-image-registry/node-ca-hv2p6" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.840998 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad7af928-88e1-468c-9471-8e7902a4a6ee-host\") pod \"node-ca-hv2p6\" (UID: \"ad7af928-88e1-468c-9471-8e7902a4a6ee\") " pod="openshift-image-registry/node-ca-hv2p6" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.842205 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ad7af928-88e1-468c-9471-8e7902a4a6ee-serviceca\") pod \"node-ca-hv2p6\" (UID: \"ad7af928-88e1-468c-9471-8e7902a4a6ee\") " pod="openshift-image-registry/node-ca-hv2p6" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.851043 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.865664 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92vc8\" (UniqueName: \"kubernetes.io/projected/ad7af928-88e1-468c-9471-8e7902a4a6ee-kube-api-access-92vc8\") pod \"node-ca-hv2p6\" (UID: \"ad7af928-88e1-468c-9471-8e7902a4a6ee\") " pod="openshift-image-registry/node-ca-hv2p6" Mar 17 
11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.868130 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.882873 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.896405 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.909163 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.920046 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.920404 4742 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.920481 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.920506 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.920538 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.920561 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:19Z","lastTransitionTime":"2026-03-17T11:13:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:19 crc kubenswrapper[4742]: I0317 11:13:19.940472 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.024966 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.025031 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.025048 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.025074 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.025093 4742 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:20Z","lastTransitionTime":"2026-03-17T11:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.043029 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hv2p6" Mar 17 11:13:20 crc kubenswrapper[4742]: W0317 11:13:20.069977 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad7af928_88e1_468c_9471_8e7902a4a6ee.slice/crio-0a88297e344ac2f53bc8900d4476b9252f7eb8cc64b3130947b15da6aa39fb8e WatchSource:0}: Error finding container 0a88297e344ac2f53bc8900d4476b9252f7eb8cc64b3130947b15da6aa39fb8e: Status 404 returned error can't find the container with id 0a88297e344ac2f53bc8900d4476b9252f7eb8cc64b3130947b15da6aa39fb8e Mar 17 11:13:20 crc kubenswrapper[4742]: E0317 11:13:20.075051 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:20 crc kubenswrapper[4742]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 17 11:13:20 crc kubenswrapper[4742]: while [ true ]; Mar 17 11:13:20 crc kubenswrapper[4742]: do Mar 17 11:13:20 crc kubenswrapper[4742]: for f in $(ls /tmp/serviceca); do Mar 17 11:13:20 crc kubenswrapper[4742]: echo $f Mar 17 11:13:20 crc kubenswrapper[4742]: ca_file_path="/tmp/serviceca/${f}" Mar 17 11:13:20 crc kubenswrapper[4742]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 17 11:13:20 crc kubenswrapper[4742]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 17 11:13:20 crc kubenswrapper[4742]: if [ -e "${reg_dir_path}" ]; then Mar 17 11:13:20 crc kubenswrapper[4742]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 17 11:13:20 crc kubenswrapper[4742]: else Mar 17 11:13:20 crc kubenswrapper[4742]: mkdir $reg_dir_path Mar 17 11:13:20 crc kubenswrapper[4742]: cp $ca_file_path $reg_dir_path/ca.crt Mar 17 11:13:20 crc kubenswrapper[4742]: fi Mar 17 11:13:20 crc kubenswrapper[4742]: done Mar 17 11:13:20 crc kubenswrapper[4742]: for d in $(ls /etc/docker/certs.d); do Mar 17 11:13:20 crc kubenswrapper[4742]: echo $d Mar 17 11:13:20 crc kubenswrapper[4742]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 17 11:13:20 crc kubenswrapper[4742]: reg_conf_path="/tmp/serviceca/${dp}" Mar 17 11:13:20 crc kubenswrapper[4742]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 17 11:13:20 crc kubenswrapper[4742]: rm -rf /etc/docker/certs.d/$d Mar 17 11:13:20 crc kubenswrapper[4742]: fi Mar 17 11:13:20 crc kubenswrapper[4742]: done Mar 17 11:13:20 crc kubenswrapper[4742]: sleep 60 & wait ${!} Mar 17 11:13:20 crc kubenswrapper[4742]: done Mar 17 11:13:20 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-92vc8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-hv2p6_openshift-image-registry(ad7af928-88e1-468c-9471-8e7902a4a6ee): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:20 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:20 crc kubenswrapper[4742]: E0317 11:13:20.076274 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-hv2p6" podUID="ad7af928-88e1-468c-9471-8e7902a4a6ee" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.128133 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.128205 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.128227 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.128257 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.128277 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:20Z","lastTransitionTime":"2026-03-17T11:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.201216 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hv2p6" event={"ID":"ad7af928-88e1-468c-9471-8e7902a4a6ee","Type":"ContainerStarted","Data":"0a88297e344ac2f53bc8900d4476b9252f7eb8cc64b3130947b15da6aa39fb8e"} Mar 17 11:13:20 crc kubenswrapper[4742]: E0317 11:13:20.204011 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:20 crc kubenswrapper[4742]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 17 11:13:20 crc kubenswrapper[4742]: while [ true ]; Mar 17 11:13:20 crc kubenswrapper[4742]: do Mar 17 11:13:20 crc kubenswrapper[4742]: for f in $(ls /tmp/serviceca); do Mar 17 11:13:20 crc kubenswrapper[4742]: echo $f Mar 17 11:13:20 crc kubenswrapper[4742]: ca_file_path="/tmp/serviceca/${f}" Mar 17 11:13:20 crc kubenswrapper[4742]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 17 11:13:20 crc kubenswrapper[4742]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 17 11:13:20 crc kubenswrapper[4742]: if [ -e "${reg_dir_path}" ]; then Mar 17 11:13:20 crc kubenswrapper[4742]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 17 11:13:20 crc kubenswrapper[4742]: else Mar 17 11:13:20 crc kubenswrapper[4742]: mkdir $reg_dir_path Mar 17 11:13:20 crc kubenswrapper[4742]: cp $ca_file_path $reg_dir_path/ca.crt Mar 17 11:13:20 crc kubenswrapper[4742]: fi Mar 17 11:13:20 crc kubenswrapper[4742]: done Mar 17 11:13:20 crc kubenswrapper[4742]: for d in $(ls /etc/docker/certs.d); do Mar 17 11:13:20 crc kubenswrapper[4742]: echo $d Mar 17 11:13:20 crc kubenswrapper[4742]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 17 11:13:20 crc kubenswrapper[4742]: reg_conf_path="/tmp/serviceca/${dp}" Mar 17 11:13:20 crc kubenswrapper[4742]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 17 11:13:20 crc kubenswrapper[4742]: rm -rf /etc/docker/certs.d/$d Mar 17 11:13:20 crc kubenswrapper[4742]: fi Mar 17 11:13:20 crc kubenswrapper[4742]: done Mar 17 11:13:20 crc kubenswrapper[4742]: sleep 60 & wait ${!} Mar 17 11:13:20 crc kubenswrapper[4742]: done Mar 17 11:13:20 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-92vc8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-hv2p6_openshift-image-registry(ad7af928-88e1-468c-9471-8e7902a4a6ee): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:20 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:20 crc kubenswrapper[4742]: E0317 11:13:20.205262 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-hv2p6" podUID="ad7af928-88e1-468c-9471-8e7902a4a6ee" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.225192 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.231760 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.232079 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.232308 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.232537 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.232754 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:20Z","lastTransitionTime":"2026-03-17T11:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.245570 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.261426 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.274550 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.293477 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 
11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:20 crc 
kubenswrapper[4742]: I0317 11:13:20.306685 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.320410 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.337083 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:20 
crc kubenswrapper[4742]: I0317 11:13:20.337134 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.337151 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.337175 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.337193 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:20Z","lastTransitionTime":"2026-03-17T11:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.337630 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.354550 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.376656 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.395149 4742 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.413445 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.428089 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.439475 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.439557 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.439575 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.439601 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.439619 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:20Z","lastTransitionTime":"2026-03-17T11:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.458206 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"image
ID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.542827 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.543018 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.543093 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.543129 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.543151 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:20Z","lastTransitionTime":"2026-03-17T11:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.646576 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.646610 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.646627 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.646647 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.646663 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:20Z","lastTransitionTime":"2026-03-17T11:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.750620 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.750679 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.750701 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.750724 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.750742 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:20Z","lastTransitionTime":"2026-03-17T11:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.853402 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.853482 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.853509 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.853537 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.853554 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:20Z","lastTransitionTime":"2026-03-17T11:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.957198 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.957253 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.957273 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.957299 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:20 crc kubenswrapper[4742]: I0317 11:13:20.957316 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:20Z","lastTransitionTime":"2026-03-17T11:13:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.060639 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.060728 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.060746 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.060770 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.060787 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:21Z","lastTransitionTime":"2026-03-17T11:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.163814 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.163877 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.163900 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.163964 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.163990 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:21Z","lastTransitionTime":"2026-03-17T11:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.267126 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.267217 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.267251 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.267285 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.267307 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:21Z","lastTransitionTime":"2026-03-17T11:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.355820 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 17 11:13:21 crc kubenswrapper[4742]: E0317 11:13:21.356126 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:13:25.356075241 +0000 UTC m=+108.482203039 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.356598 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.356798 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.357083 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 17 11:13:21 crc kubenswrapper[4742]: E0317 11:13:21.356899 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 17 11:13:21 crc kubenswrapper[4742]: E0317 11:13:21.357378 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 17 11:13:21 crc kubenswrapper[4742]: E0317 11:13:21.357145 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 17 11:13:21 crc kubenswrapper[4742]: E0317 11:13:21.357452 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 17 11:13:21 crc kubenswrapper[4742]: E0317 11:13:21.357479 4742 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.357302 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 17 11:13:21 crc kubenswrapper[4742]: E0317 11:13:21.357568 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:25.357544331 +0000 UTC m=+108.483672269 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 17 11:13:21 crc kubenswrapper[4742]: E0317 11:13:21.357275 4742 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 17 11:13:21 crc kubenswrapper[4742]: E0317 11:13:21.357750 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:25.357720656 +0000 UTC m=+108.483848454 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 17 11:13:21 crc kubenswrapper[4742]: E0317 11:13:21.357403 4742 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 17 11:13:21 crc kubenswrapper[4742]: E0317 11:13:21.357816 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:25.357803068 +0000 UTC m=+108.483930866 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 17 11:13:21 crc kubenswrapper[4742]: E0317 11:13:21.358006 4742 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 17 11:13:21 crc kubenswrapper[4742]: E0317 11:13:21.358059 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:25.358043655 +0000 UTC m=+108.484171443 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.370545 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.370633 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.370662 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.370699 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.370722 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:21Z","lastTransitionTime":"2026-03-17T11:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.474143 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.474195 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.474213 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.474235 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.474252 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:21Z","lastTransitionTime":"2026-03-17T11:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.578207 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.578267 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.578289 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.578314 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.578331 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:21Z","lastTransitionTime":"2026-03-17T11:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.662162 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.662211 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.662228 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 17 11:13:21 crc kubenswrapper[4742]: E0317 11:13:21.662365 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 17 11:13:21 crc kubenswrapper[4742]: E0317 11:13:21.662503 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 17 11:13:21 crc kubenswrapper[4742]: E0317 11:13:21.662649 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.680858 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.681269 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.681412 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.681556 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.681696 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:21Z","lastTransitionTime":"2026-03-17T11:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.785132 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.785198 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.785217 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.785244 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.785265 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:21Z","lastTransitionTime":"2026-03-17T11:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.888129 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.888328 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.888362 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.888398 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.888423 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:21Z","lastTransitionTime":"2026-03-17T11:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.991107 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.991181 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.991203 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.991228 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:21 crc kubenswrapper[4742]: I0317 11:13:21.991246 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:21Z","lastTransitionTime":"2026-03-17T11:13:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.094587 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.094649 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.094667 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.094691 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.094710 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:22Z","lastTransitionTime":"2026-03-17T11:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.197998 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.198073 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.198092 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.198121 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.198139 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:22Z","lastTransitionTime":"2026-03-17T11:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.302045 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.302110 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.302128 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.302155 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.302173 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:22Z","lastTransitionTime":"2026-03-17T11:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.405644 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.405710 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.405736 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.405762 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.405779 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:22Z","lastTransitionTime":"2026-03-17T11:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.509385 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.509465 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.509482 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.509508 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.509527 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:22Z","lastTransitionTime":"2026-03-17T11:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.612557 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.612596 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.612607 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.612624 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.612636 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:22Z","lastTransitionTime":"2026-03-17T11:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.715646 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.715729 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.715755 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.715794 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.715819 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:22Z","lastTransitionTime":"2026-03-17T11:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.819559 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.819644 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.819677 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.819711 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.819733 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:22Z","lastTransitionTime":"2026-03-17T11:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.922626 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.922685 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.922705 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.922731 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:22 crc kubenswrapper[4742]: I0317 11:13:22.922749 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:22Z","lastTransitionTime":"2026-03-17T11:13:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.026001 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.026066 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.026090 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.026123 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.026149 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:23Z","lastTransitionTime":"2026-03-17T11:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.128817 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.128878 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.128897 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.128962 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.128989 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:23Z","lastTransitionTime":"2026-03-17T11:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.232180 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.232250 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.232269 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.232293 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.232309 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:23Z","lastTransitionTime":"2026-03-17T11:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.335815 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.335888 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.335939 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.335990 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.336010 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:23Z","lastTransitionTime":"2026-03-17T11:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.438292 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.438347 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.438357 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.438377 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.438389 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:23Z","lastTransitionTime":"2026-03-17T11:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.541891 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.542012 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.542036 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.542066 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.542086 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:23Z","lastTransitionTime":"2026-03-17T11:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.645547 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.645609 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.645633 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.645664 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.645687 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:23Z","lastTransitionTime":"2026-03-17T11:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.662598 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.662639 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 17 11:13:23 crc kubenswrapper[4742]: E0317 11:13:23.662751 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.662607 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 17 11:13:23 crc kubenswrapper[4742]: E0317 11:13:23.663049 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 17 11:13:23 crc kubenswrapper[4742]: E0317 11:13:23.663172 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.749532 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.749603 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.749630 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.749663 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.749685 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:23Z","lastTransitionTime":"2026-03-17T11:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.853223 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.853305 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.853323 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.853350 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.853376 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:23Z","lastTransitionTime":"2026-03-17T11:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.956347 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.956419 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.956438 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.956469 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:23 crc kubenswrapper[4742]: I0317 11:13:23.956488 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:23Z","lastTransitionTime":"2026-03-17T11:13:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.059569 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.059641 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.059659 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.059685 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.059703 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:24Z","lastTransitionTime":"2026-03-17T11:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.162824 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.162895 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.162942 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.162970 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.162988 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:24Z","lastTransitionTime":"2026-03-17T11:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.266181 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.266242 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.266263 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.266297 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.266321 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:24Z","lastTransitionTime":"2026-03-17T11:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.369840 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.369899 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.369943 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.369965 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.369980 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:24Z","lastTransitionTime":"2026-03-17T11:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.473420 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.473479 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.473498 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.473521 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.473538 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:24Z","lastTransitionTime":"2026-03-17T11:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.576355 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.576424 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.576442 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.576470 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.576489 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:24Z","lastTransitionTime":"2026-03-17T11:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.679394 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.679482 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.679502 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.679530 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.679551 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:24Z","lastTransitionTime":"2026-03-17T11:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.782747 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.782842 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.782878 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.782980 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.783009 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:24Z","lastTransitionTime":"2026-03-17T11:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.886019 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.886092 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.886114 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.886143 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.886160 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:24Z","lastTransitionTime":"2026-03-17T11:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.989053 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.989179 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.989197 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.989226 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:24 crc kubenswrapper[4742]: I0317 11:13:24.989244 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:24Z","lastTransitionTime":"2026-03-17T11:13:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.092515 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.092594 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.092612 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.092649 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.092686 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:25Z","lastTransitionTime":"2026-03-17T11:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.196633 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.196688 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.196707 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.196730 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.196748 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:25Z","lastTransitionTime":"2026-03-17T11:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.300607 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.300679 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.300703 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.300735 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.300758 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:25Z","lastTransitionTime":"2026-03-17T11:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.402259 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.402443 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 17 11:13:25 crc kubenswrapper[4742]: E0317 11:13:25.402533 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:13:33.402495505 +0000 UTC m=+116.528623303 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.402592 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 17 11:13:25 crc kubenswrapper[4742]: E0317 11:13:25.402699 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 17 11:13:25 crc kubenswrapper[4742]: E0317 11:13:25.402742 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 17 11:13:25 crc kubenswrapper[4742]: E0317 11:13:25.402767 4742 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 17 11:13:25 crc kubenswrapper[4742]: E0317 11:13:25.402814 4742 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 17 11:13:25 crc kubenswrapper[4742]: E0317 11:13:25.403040 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 17 11:13:25 crc kubenswrapper[4742]: E0317 11:13:25.403069 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 17 11:13:25 crc kubenswrapper[4742]: E0317 11:13:25.403091 4742 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 17 11:13:25 crc kubenswrapper[4742]: E0317 11:13:25.402829 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:33.402809854 +0000 UTC m=+116.528937642 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 17 11:13:25 crc kubenswrapper[4742]: E0317 11:13:25.403171 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:33.403147783 +0000 UTC m=+116.529275581 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 17 11:13:25 crc kubenswrapper[4742]: E0317 11:13:25.403203 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:33.403187924 +0000 UTC m=+116.529315732 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.402697 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.403298 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 17 11:13:25 crc kubenswrapper[4742]: E0317 11:13:25.403399 4742 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 17 11:13:25 crc kubenswrapper[4742]: E0317 11:13:25.403462 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:33.403444022 +0000 UTC m=+116.529571820 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.404223 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.404284 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.404306 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.404336 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.404359 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:25Z","lastTransitionTime":"2026-03-17T11:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.507822 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.508291 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.508437 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.508604 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.508756 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:25Z","lastTransitionTime":"2026-03-17T11:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.612121 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.612192 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.612210 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.612241 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.612259 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:25Z","lastTransitionTime":"2026-03-17T11:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.640293 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9"] Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.641567 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.644776 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.645234 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.662024 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.662026 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:25 crc kubenswrapper[4742]: E0317 11:13:25.663214 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.662137 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:25 crc kubenswrapper[4742]: E0317 11:13:25.663394 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:25 crc kubenswrapper[4742]: E0317 11:13:25.663570 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.679627 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.696942 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.706456 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b8c53ad4-b584-48be-8055-a928c8a0178f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qv2v9\" (UID: \"b8c53ad4-b584-48be-8055-a928c8a0178f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.706575 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8c53ad4-b584-48be-8055-a928c8a0178f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qv2v9\" (UID: \"b8c53ad4-b584-48be-8055-a928c8a0178f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.706631 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kmch\" (UniqueName: \"kubernetes.io/projected/b8c53ad4-b584-48be-8055-a928c8a0178f-kube-api-access-5kmch\") pod \"ovnkube-control-plane-749d76644c-qv2v9\" (UID: \"b8c53ad4-b584-48be-8055-a928c8a0178f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.706709 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8c53ad4-b584-48be-8055-a928c8a0178f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qv2v9\" (UID: \"b8c53ad4-b584-48be-8055-a928c8a0178f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.715375 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.725219 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.725270 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.725288 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.725313 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.725333 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:25Z","lastTransitionTime":"2026-03-17T11:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.728132 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.745000 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.757749 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.774357 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.789693 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.806006 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.807963 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8c53ad4-b584-48be-8055-a928c8a0178f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qv2v9\" (UID: \"b8c53ad4-b584-48be-8055-a928c8a0178f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.809003 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b8c53ad4-b584-48be-8055-a928c8a0178f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qv2v9\" (UID: \"b8c53ad4-b584-48be-8055-a928c8a0178f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.809057 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8c53ad4-b584-48be-8055-a928c8a0178f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qv2v9\" (UID: \"b8c53ad4-b584-48be-8055-a928c8a0178f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.809119 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kmch\" (UniqueName: \"kubernetes.io/projected/b8c53ad4-b584-48be-8055-a928c8a0178f-kube-api-access-5kmch\") pod \"ovnkube-control-plane-749d76644c-qv2v9\" (UID: \"b8c53ad4-b584-48be-8055-a928c8a0178f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.816849 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8c53ad4-b584-48be-8055-a928c8a0178f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qv2v9\" (UID: \"b8c53ad4-b584-48be-8055-a928c8a0178f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.818072 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/b8c53ad4-b584-48be-8055-a928c8a0178f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qv2v9\" (UID: \"b8c53ad4-b584-48be-8055-a928c8a0178f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.818760 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8c53ad4-b584-48be-8055-a928c8a0178f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qv2v9\" (UID: \"b8c53ad4-b584-48be-8055-a928c8a0178f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.833569 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kmch\" (UniqueName: \"kubernetes.io/projected/b8c53ad4-b584-48be-8055-a928c8a0178f-kube-api-access-5kmch\") pod \"ovnkube-control-plane-749d76644c-qv2v9\" (UID: \"b8c53ad4-b584-48be-8055-a928c8a0178f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.833783 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.833821 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.833839 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.833864 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.833882 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:25Z","lastTransitionTime":"2026-03-17T11:13:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.834971 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.846747 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.861822 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.874202 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
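Note: all of the status_manager "Failed to update status" records in this stretch fail identically: the API server's pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 refuses connections, so every status PATCH is rejected with an internal error. A quick connectivity probe, assuming shell access on the node (port and path are taken from the error text; this only confirms the symptom, it does not start the webhook):

    # Anything listening on the webhook port?
    ss -ltn 'sport = :9743'
    # Poke the endpoint directly; "connection refused" here matches the log.
    curl -ks --max-time 5 https://127.0.0.1:9743/pod || echo 'webhook unreachable'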
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.886748 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:25 crc kubenswrapper[4742]: I0317 11:13:25.911567 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.257895 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.259486 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.259508 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.259518 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.259543 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.259557 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:26Z","lastTransitionTime":"2026-03-17T11:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:26 crc kubenswrapper[4742]: W0317 11:13:26.278491 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8c53ad4_b584_48be_8055_a928c8a0178f.slice/crio-142a38a27c8b0d18be77fce562659fc6b994bed9ad95fca763168a83a5f8765b WatchSource:0}: Error finding container 142a38a27c8b0d18be77fce562659fc6b994bed9ad95fca763168a83a5f8765b: Status 404 returned error can't find the container with id 142a38a27c8b0d18be77fce562659fc6b994bed9ad95fca763168a83a5f8765b Mar 17 11:13:26 crc kubenswrapper[4742]: E0317 11:13:26.280981 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:26 crc kubenswrapper[4742]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 17 11:13:26 crc kubenswrapper[4742]: set -euo pipefail Mar 17 11:13:26 crc kubenswrapper[4742]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 17 11:13:26 crc kubenswrapper[4742]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 17 11:13:26 crc kubenswrapper[4742]: # As the secret mount is optional we must wait for the files to be present. Mar 17 11:13:26 crc kubenswrapper[4742]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 17 11:13:26 crc kubenswrapper[4742]: TS=$(date +%s) Mar 17 11:13:26 crc kubenswrapper[4742]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 17 11:13:26 crc kubenswrapper[4742]: HAS_LOGGED_INFO=0 Mar 17 11:13:26 crc kubenswrapper[4742]: Mar 17 11:13:26 crc kubenswrapper[4742]: log_missing_certs(){ Mar 17 11:13:26 crc kubenswrapper[4742]: CUR_TS=$(date +%s) Mar 17 11:13:26 crc kubenswrapper[4742]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 17 11:13:26 crc kubenswrapper[4742]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 17 11:13:26 crc kubenswrapper[4742]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 17 11:13:26 crc kubenswrapper[4742]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. 
Mar 17 11:13:26 crc kubenswrapper[4742]: HAS_LOGGED_INFO=1 Mar 17 11:13:26 crc kubenswrapper[4742]: fi Mar 17 11:13:26 crc kubenswrapper[4742]: } Mar 17 11:13:26 crc kubenswrapper[4742]: while [[ ! -f "${TLS_PK}" || ! -f "${TLS_CERT}" ]] ; do Mar 17 11:13:26 crc kubenswrapper[4742]: log_missing_certs Mar 17 11:13:26 crc kubenswrapper[4742]: sleep 5 Mar 17 11:13:26 crc kubenswrapper[4742]: done Mar 17 11:13:26 crc kubenswrapper[4742]: Mar 17 11:13:26 crc kubenswrapper[4742]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 17 11:13:26 crc kubenswrapper[4742]: exec /usr/bin/kube-rbac-proxy \ Mar 17 11:13:26 crc kubenswrapper[4742]: --logtostderr \ Mar 17 11:13:26 crc kubenswrapper[4742]: --secure-listen-address=:9108 \ Mar 17 11:13:26 crc kubenswrapper[4742]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 17 11:13:26 crc kubenswrapper[4742]: --upstream=http://127.0.0.1:29108/ \ Mar 17 11:13:26 crc kubenswrapper[4742]: --tls-private-key-file=${TLS_PK} \ Mar 17 11:13:26 crc kubenswrapper[4742]: --tls-cert-file=${TLS_CERT} Mar 17 11:13:26 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5kmch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-qv2v9_openshift-ovn-kubernetes(b8c53ad4-b584-48be-8055-a928c8a0178f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:26 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:26 crc kubenswrapper[4742]: E0317 11:13:26.283845 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:26 crc kubenswrapper[4742]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 17 11:13:26 crc kubenswrapper[4742]: if [[ -f "/env/_master" ]]; then Mar 17 11:13:26 crc kubenswrapper[4742]: set -o allexport Mar 17 11:13:26 crc kubenswrapper[4742]: source "/env/_master" Mar 17 11:13:26 crc kubenswrapper[4742]: set +o allexport Mar 17 11:13:26 crc kubenswrapper[4742]: fi Mar 17 11:13:26 crc kubenswrapper[4742]: Mar 17 11:13:26 crc kubenswrapper[4742]: ovn_v4_join_subnet_opt= Mar 17 11:13:26 crc kubenswrapper[4742]: if [[ "" != "" ]]; then Mar 17 11:13:26 crc kubenswrapper[4742]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 17 11:13:26 crc kubenswrapper[4742]: fi Mar 17 11:13:26 crc kubenswrapper[4742]: 
ovn_v6_join_subnet_opt= Mar 17 11:13:26 crc kubenswrapper[4742]: if [[ "" != "" ]]; then Mar 17 11:13:26 crc kubenswrapper[4742]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 17 11:13:26 crc kubenswrapper[4742]: fi Mar 17 11:13:26 crc kubenswrapper[4742]: Mar 17 11:13:26 crc kubenswrapper[4742]: ovn_v4_transit_switch_subnet_opt= Mar 17 11:13:26 crc kubenswrapper[4742]: if [[ "" != "" ]]; then Mar 17 11:13:26 crc kubenswrapper[4742]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 17 11:13:26 crc kubenswrapper[4742]: fi Mar 17 11:13:26 crc kubenswrapper[4742]: ovn_v6_transit_switch_subnet_opt= Mar 17 11:13:26 crc kubenswrapper[4742]: if [[ "" != "" ]]; then Mar 17 11:13:26 crc kubenswrapper[4742]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 17 11:13:26 crc kubenswrapper[4742]: fi Mar 17 11:13:26 crc kubenswrapper[4742]: Mar 17 11:13:26 crc kubenswrapper[4742]: dns_name_resolver_enabled_flag= Mar 17 11:13:26 crc kubenswrapper[4742]: if [[ "false" == "true" ]]; then Mar 17 11:13:26 crc kubenswrapper[4742]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 17 11:13:26 crc kubenswrapper[4742]: fi Mar 17 11:13:26 crc kubenswrapper[4742]: Mar 17 11:13:26 crc kubenswrapper[4742]: persistent_ips_enabled_flag= Mar 17 11:13:26 crc kubenswrapper[4742]: if [[ "true" == "true" ]]; then Mar 17 11:13:26 crc kubenswrapper[4742]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 17 11:13:26 crc kubenswrapper[4742]: fi Mar 17 11:13:26 crc kubenswrapper[4742]: Mar 17 11:13:26 crc kubenswrapper[4742]: # This is needed so that converting clusters from GA to TP Mar 17 11:13:26 crc kubenswrapper[4742]: # will rollout control plane pods as well Mar 17 11:13:26 crc kubenswrapper[4742]: network_segmentation_enabled_flag= Mar 17 11:13:26 crc kubenswrapper[4742]: multi_network_enabled_flag= Mar 17 11:13:26 crc kubenswrapper[4742]: if [[ "true" == "true" ]]; then Mar 17 11:13:26 crc kubenswrapper[4742]: multi_network_enabled_flag="--enable-multi-network" Mar 17 11:13:26 crc kubenswrapper[4742]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 17 11:13:26 crc kubenswrapper[4742]: fi Mar 17 11:13:26 crc kubenswrapper[4742]: Mar 17 11:13:26 crc kubenswrapper[4742]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 17 11:13:26 crc kubenswrapper[4742]: exec /usr/bin/ovnkube \ Mar 17 11:13:26 crc kubenswrapper[4742]: --enable-interconnect \ Mar 17 11:13:26 crc kubenswrapper[4742]: --init-cluster-manager "${K8S_NODE}" \ Mar 17 11:13:26 crc kubenswrapper[4742]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 17 11:13:26 crc kubenswrapper[4742]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 17 11:13:26 crc kubenswrapper[4742]: --metrics-bind-address "127.0.0.1:29108" \ Mar 17 11:13:26 crc kubenswrapper[4742]: --metrics-enable-pprof \ Mar 17 11:13:26 crc kubenswrapper[4742]: --metrics-enable-config-duration \ Mar 17 11:13:26 crc kubenswrapper[4742]: ${ovn_v4_join_subnet_opt} \ Mar 17 11:13:26 crc kubenswrapper[4742]: ${ovn_v6_join_subnet_opt} \ Mar 17 11:13:26 crc kubenswrapper[4742]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 17 11:13:26 crc kubenswrapper[4742]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 17 11:13:26 crc kubenswrapper[4742]: ${dns_name_resolver_enabled_flag} \ Mar 17 11:13:26 crc kubenswrapper[4742]: ${persistent_ips_enabled_flag} \ Mar 17 11:13:26 crc kubenswrapper[4742]: ${multi_network_enabled_flag} \ Mar 17 11:13:26 crc 
kubenswrapper[4742]: ${network_segmentation_enabled_flag} Mar 17 11:13:26 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5kmch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-qv2v9_openshift-ovn-kubernetes(b8c53ad4-b584-48be-8055-a928c8a0178f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:26 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:26 crc kubenswrapper[4742]: E0317 11:13:26.285069 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" podUID="b8c53ad4-b584-48be-8055-a928c8a0178f" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.362758 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.363136 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.363338 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.363539 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.363757 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:26Z","lastTransitionTime":"2026-03-17T11:13:26Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.467308 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.467379 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.467401 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.467431 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.467450 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:26Z","lastTransitionTime":"2026-03-17T11:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.523154 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-drnx8"] Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.523584 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:26 crc kubenswrapper[4742]: E0317 11:13:26.523651 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
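Note: the recurring CreateContainerConfigError "services have not yet been read at least once, cannot construct envvars" refers to the kubelet's service environment injection: before starting any container it builds the *_SERVICE_HOST/*_SERVICE_PORT variables from the Services in the pod's namespace, and it refuses to construct a container's environment until its service informer has synced at least once. These failures therefore clear on their own once the kubelet's API connection is healthy. For reference, the per-pod knob governing that injection is the standard enableServiceLinks field (as far as this log shows, the sync requirement applies either way):

    # Real API field, shown only to document the mechanism, not as the fix here.
    oc explain pod.spec.enableServiceLinks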
pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.542002 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.556348 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.559031 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rtzs\" (UniqueName: \"kubernetes.io/projected/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-kube-api-access-2rtzs\") pod \"network-metrics-daemon-drnx8\" (UID: \"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\") " pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.559118 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs\") pod \"network-metrics-daemon-drnx8\" (UID: \"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\") " pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.567534 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.571056 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.571356 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.571568 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.571764 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.572004 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:26Z","lastTransitionTime":"2026-03-17T11:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.584021 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"image
ID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.596390 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.613327 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.624213 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.635089 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.648061 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.659683 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rtzs\" (UniqueName: \"kubernetes.io/projected/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-kube-api-access-2rtzs\") pod \"network-metrics-daemon-drnx8\" (UID: \"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\") " pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.659716 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs\") pod \"network-metrics-daemon-drnx8\" (UID: \"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\") " pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:26 crc kubenswrapper[4742]: E0317 11:13:26.659848 4742 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 11:13:26 crc kubenswrapper[4742]: E0317 11:13:26.659898 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs podName:6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14 nodeName:}" failed. 
No retries permitted until 2026-03-17 11:13:27.159885361 +0000 UTC m=+110.286013119 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs") pod "network-metrics-daemon-drnx8" (UID: "6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.667040 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.677872 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.679123 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.679189 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.679207 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.679226 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.679241 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:26Z","lastTransitionTime":"2026-03-17T11:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.688495 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rtzs\" (UniqueName: \"kubernetes.io/projected/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-kube-api-access-2rtzs\") pod \"network-metrics-daemon-drnx8\" (UID: \"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\") " pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.690001 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.699400 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.710630 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.722815 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.732114 4742 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.782384 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:26 crc 
kubenswrapper[4742]: I0317 11:13:26.782897 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.783097 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.783239 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.783379 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:26Z","lastTransitionTime":"2026-03-17T11:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.886353 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.886679 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.886802 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.886893 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.886996 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:26Z","lastTransitionTime":"2026-03-17T11:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.989937 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.990318 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.990463 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.990688 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:26 crc kubenswrapper[4742]: I0317 11:13:26.990970 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:26Z","lastTransitionTime":"2026-03-17T11:13:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.094248 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.094306 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.094324 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.094352 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.094370 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:27Z","lastTransitionTime":"2026-03-17T11:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.165570 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs\") pod \"network-metrics-daemon-drnx8\" (UID: \"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\") " pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:27 crc kubenswrapper[4742]: E0317 11:13:27.165848 4742 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 11:13:27 crc kubenswrapper[4742]: E0317 11:13:27.166269 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs podName:6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:28.166241819 +0000 UTC m=+111.292369607 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs") pod "network-metrics-daemon-drnx8" (UID: "6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.197543 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.197613 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.197636 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.197665 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.197686 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:27Z","lastTransitionTime":"2026-03-17T11:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.265063 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" event={"ID":"b8c53ad4-b584-48be-8055-a928c8a0178f","Type":"ContainerStarted","Data":"142a38a27c8b0d18be77fce562659fc6b994bed9ad95fca763168a83a5f8765b"} Mar 17 11:13:27 crc kubenswrapper[4742]: E0317 11:13:27.267536 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:27 crc kubenswrapper[4742]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 17 11:13:27 crc kubenswrapper[4742]: set -euo pipefail Mar 17 11:13:27 crc kubenswrapper[4742]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 17 11:13:27 crc kubenswrapper[4742]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 17 11:13:27 crc kubenswrapper[4742]: # As the secret mount is optional we must wait for the files to be present. Mar 17 11:13:27 crc kubenswrapper[4742]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 17 11:13:27 crc kubenswrapper[4742]: TS=$(date +%s) Mar 17 11:13:27 crc kubenswrapper[4742]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 17 11:13:27 crc kubenswrapper[4742]: HAS_LOGGED_INFO=0 Mar 17 11:13:27 crc kubenswrapper[4742]: Mar 17 11:13:27 crc kubenswrapper[4742]: log_missing_certs(){ Mar 17 11:13:27 crc kubenswrapper[4742]: CUR_TS=$(date +%s) Mar 17 11:13:27 crc kubenswrapper[4742]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 17 11:13:27 crc kubenswrapper[4742]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 17 11:13:27 crc kubenswrapper[4742]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 17 11:13:27 crc kubenswrapper[4742]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. 
Mar 17 11:13:27 crc kubenswrapper[4742]: HAS_LOGGED_INFO=1 Mar 17 11:13:27 crc kubenswrapper[4742]: fi Mar 17 11:13:27 crc kubenswrapper[4742]: } Mar 17 11:13:27 crc kubenswrapper[4742]: while [[ ! -f "${TLS_PK}" || ! -f "${TLS_CERT}" ]] ; do Mar 17 11:13:27 crc kubenswrapper[4742]: log_missing_certs Mar 17 11:13:27 crc kubenswrapper[4742]: sleep 5 Mar 17 11:13:27 crc kubenswrapper[4742]: done Mar 17 11:13:27 crc kubenswrapper[4742]: Mar 17 11:13:27 crc kubenswrapper[4742]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 17 11:13:27 crc kubenswrapper[4742]: exec /usr/bin/kube-rbac-proxy \ Mar 17 11:13:27 crc kubenswrapper[4742]: --logtostderr \ Mar 17 11:13:27 crc kubenswrapper[4742]: --secure-listen-address=:9108 \ Mar 17 11:13:27 crc kubenswrapper[4742]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 17 11:13:27 crc kubenswrapper[4742]: --upstream=http://127.0.0.1:29108/ \ Mar 17 11:13:27 crc kubenswrapper[4742]: --tls-private-key-file=${TLS_PK} \ Mar 17 11:13:27 crc kubenswrapper[4742]: --tls-cert-file=${TLS_CERT} Mar 17 11:13:27 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5kmch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-qv2v9_openshift-ovn-kubernetes(b8c53ad4-b584-48be-8055-a928c8a0178f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:27 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:27 crc kubenswrapper[4742]: E0317 11:13:27.270353 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:27 crc kubenswrapper[4742]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 17 11:13:27 crc kubenswrapper[4742]: if [[ -f "/env/_master" ]]; then Mar 17 11:13:27 crc kubenswrapper[4742]: set -o allexport Mar 17 11:13:27 crc kubenswrapper[4742]: source "/env/_master" Mar 17 11:13:27 crc kubenswrapper[4742]: set +o allexport Mar 17 11:13:27 crc kubenswrapper[4742]: fi Mar 17 11:13:27 crc kubenswrapper[4742]: Mar 17 11:13:27 crc kubenswrapper[4742]: ovn_v4_join_subnet_opt= Mar 17 11:13:27 crc kubenswrapper[4742]: if [[ "" != "" ]]; then Mar 17 11:13:27 crc kubenswrapper[4742]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 17 11:13:27 crc kubenswrapper[4742]: fi Mar 17 11:13:27 crc kubenswrapper[4742]: 
ovn_v6_join_subnet_opt= Mar 17 11:13:27 crc kubenswrapper[4742]: if [[ "" != "" ]]; then Mar 17 11:13:27 crc kubenswrapper[4742]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 17 11:13:27 crc kubenswrapper[4742]: fi Mar 17 11:13:27 crc kubenswrapper[4742]: Mar 17 11:13:27 crc kubenswrapper[4742]: ovn_v4_transit_switch_subnet_opt= Mar 17 11:13:27 crc kubenswrapper[4742]: if [[ "" != "" ]]; then Mar 17 11:13:27 crc kubenswrapper[4742]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 17 11:13:27 crc kubenswrapper[4742]: fi Mar 17 11:13:27 crc kubenswrapper[4742]: ovn_v6_transit_switch_subnet_opt= Mar 17 11:13:27 crc kubenswrapper[4742]: if [[ "" != "" ]]; then Mar 17 11:13:27 crc kubenswrapper[4742]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 17 11:13:27 crc kubenswrapper[4742]: fi Mar 17 11:13:27 crc kubenswrapper[4742]: Mar 17 11:13:27 crc kubenswrapper[4742]: dns_name_resolver_enabled_flag= Mar 17 11:13:27 crc kubenswrapper[4742]: if [[ "false" == "true" ]]; then Mar 17 11:13:27 crc kubenswrapper[4742]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 17 11:13:27 crc kubenswrapper[4742]: fi Mar 17 11:13:27 crc kubenswrapper[4742]: Mar 17 11:13:27 crc kubenswrapper[4742]: persistent_ips_enabled_flag= Mar 17 11:13:27 crc kubenswrapper[4742]: if [[ "true" == "true" ]]; then Mar 17 11:13:27 crc kubenswrapper[4742]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 17 11:13:27 crc kubenswrapper[4742]: fi Mar 17 11:13:27 crc kubenswrapper[4742]: Mar 17 11:13:27 crc kubenswrapper[4742]: # This is needed so that converting clusters from GA to TP Mar 17 11:13:27 crc kubenswrapper[4742]: # will rollout control plane pods as well Mar 17 11:13:27 crc kubenswrapper[4742]: network_segmentation_enabled_flag= Mar 17 11:13:27 crc kubenswrapper[4742]: multi_network_enabled_flag= Mar 17 11:13:27 crc kubenswrapper[4742]: if [[ "true" == "true" ]]; then Mar 17 11:13:27 crc kubenswrapper[4742]: multi_network_enabled_flag="--enable-multi-network" Mar 17 11:13:27 crc kubenswrapper[4742]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 17 11:13:27 crc kubenswrapper[4742]: fi Mar 17 11:13:27 crc kubenswrapper[4742]: Mar 17 11:13:27 crc kubenswrapper[4742]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 17 11:13:27 crc kubenswrapper[4742]: exec /usr/bin/ovnkube \ Mar 17 11:13:27 crc kubenswrapper[4742]: --enable-interconnect \ Mar 17 11:13:27 crc kubenswrapper[4742]: --init-cluster-manager "${K8S_NODE}" \ Mar 17 11:13:27 crc kubenswrapper[4742]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 17 11:13:27 crc kubenswrapper[4742]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 17 11:13:27 crc kubenswrapper[4742]: --metrics-bind-address "127.0.0.1:29108" \ Mar 17 11:13:27 crc kubenswrapper[4742]: --metrics-enable-pprof \ Mar 17 11:13:27 crc kubenswrapper[4742]: --metrics-enable-config-duration \ Mar 17 11:13:27 crc kubenswrapper[4742]: ${ovn_v4_join_subnet_opt} \ Mar 17 11:13:27 crc kubenswrapper[4742]: ${ovn_v6_join_subnet_opt} \ Mar 17 11:13:27 crc kubenswrapper[4742]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 17 11:13:27 crc kubenswrapper[4742]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 17 11:13:27 crc kubenswrapper[4742]: ${dns_name_resolver_enabled_flag} \ Mar 17 11:13:27 crc kubenswrapper[4742]: ${persistent_ips_enabled_flag} \ Mar 17 11:13:27 crc kubenswrapper[4742]: ${multi_network_enabled_flag} \ Mar 17 11:13:27 crc 
kubenswrapper[4742]: ${network_segmentation_enabled_flag} Mar 17 11:13:27 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5kmch,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-qv2v9_openshift-ovn-kubernetes(b8c53ad4-b584-48be-8055-a928c8a0178f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:27 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:27 crc kubenswrapper[4742]: E0317 11:13:27.271558 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" podUID="b8c53ad4-b584-48be-8055-a928c8a0178f" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.280453 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.292190 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.301711 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.301799 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.301813 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.301837 4742 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.301850 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:27Z","lastTransitionTime":"2026-03-17T11:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.302106 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.315991 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.327653 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.344278 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.356033 4742 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection 
refused" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.365929 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.377973 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.388453 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.401970 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.404717 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.404775 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.404798 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.404827 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.404847 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:27Z","lastTransitionTime":"2026-03-17T11:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.436490 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"image
ID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.466063 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.481598 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.495307 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.505183 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 
17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.507557 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.507598 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.507609 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.507629 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.507641 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:27Z","lastTransitionTime":"2026-03-17T11:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.611151 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.611222 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.611232 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.611256 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.611270 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:27Z","lastTransitionTime":"2026-03-17T11:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.662461 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:27 crc kubenswrapper[4742]: E0317 11:13:27.662655 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.663235 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.663326 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.663235 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:27 crc kubenswrapper[4742]: E0317 11:13:27.663442 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:27 crc kubenswrapper[4742]: E0317 11:13:27.663659 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:27 crc kubenswrapper[4742]: E0317 11:13:27.663795 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.714336 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.714407 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.714425 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.714451 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.714468 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:27Z","lastTransitionTime":"2026-03-17T11:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.818370 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.818457 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.818477 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.818504 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.818525 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:27Z","lastTransitionTime":"2026-03-17T11:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.921777 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.921866 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.921892 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.921967 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:27 crc kubenswrapper[4742]: I0317 11:13:27.921991 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:27Z","lastTransitionTime":"2026-03-17T11:13:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.025977 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.026033 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.026054 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.026078 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.026096 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:28Z","lastTransitionTime":"2026-03-17T11:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.129739 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.129812 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.129831 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.129859 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.129879 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:28Z","lastTransitionTime":"2026-03-17T11:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.176294 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs\") pod \"network-metrics-daemon-drnx8\" (UID: \"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\") " pod="openshift-multus/network-metrics-daemon-drnx8"
Mar 17 11:13:28 crc kubenswrapper[4742]: E0317 11:13:28.176549 4742 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 17 11:13:28 crc kubenswrapper[4742]: E0317 11:13:28.176722 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs podName:6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:30.176685062 +0000 UTC m=+113.302813040 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs") pod "network-metrics-daemon-drnx8" (UID: "6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.233932 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.233999 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.234017 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.234042 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.234062 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:28Z","lastTransitionTime":"2026-03-17T11:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.340524 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.340606 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.340629 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.340660 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.340687 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:28Z","lastTransitionTime":"2026-03-17T11:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.444023 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.444089 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.444104 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.444125 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.444137 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:28Z","lastTransitionTime":"2026-03-17T11:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.547772 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.547830 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.547842 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.547864 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.547877 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:28Z","lastTransitionTime":"2026-03-17T11:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.651324 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.651392 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.651411 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.651442 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.651467 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:28Z","lastTransitionTime":"2026-03-17T11:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.678852 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.696644 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.718487 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.740024 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.755051 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.755132 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:28 crc kubenswrapper[4742]: 
I0317 11:13:28.755150 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.755183 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.755206 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:28Z","lastTransitionTime":"2026-03-17T11:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.771199 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.785874 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.802607 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.813057 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 
17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.826407 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.835865 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.847828 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.859440 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.859524 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.859549 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.859585 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.859610 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:28Z","lastTransitionTime":"2026-03-17T11:13:28Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.861371 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.878211 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.898064 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.909110 4742 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection 
refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.919956 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.936401 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.936456 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:28 crc 
kubenswrapper[4742]: I0317 11:13:28.936475 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.936498 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.936516 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:28Z","lastTransitionTime":"2026-03-17T11:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:28 crc kubenswrapper[4742]: E0317 11:13:28.951758 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.956152 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.956210 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.956227 4742 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.956248 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.956264 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:28Z","lastTransitionTime":"2026-03-17T11:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:28 crc kubenswrapper[4742]: E0317 11:13:28.968071 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.972517 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.972565 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.972582 4742 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.972604 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.972615 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:28Z","lastTransitionTime":"2026-03-17T11:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:28 crc kubenswrapper[4742]: E0317 11:13:28.983330 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.987222 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.987308 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.987325 4742 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.987342 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:28 crc kubenswrapper[4742]: I0317 11:13:28.987356 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:28Z","lastTransitionTime":"2026-03-17T11:13:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:29 crc kubenswrapper[4742]: E0317 11:13:29.003395 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.007779 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.007827 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.007843 4742 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.007926 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.007941 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:29Z","lastTransitionTime":"2026-03-17T11:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:29 crc kubenswrapper[4742]: E0317 11:13:29.018735 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:29 crc kubenswrapper[4742]: E0317 11:13:29.018865 4742 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.020386 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.020414 4742 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.020426 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.020441 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.020453 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:29Z","lastTransitionTime":"2026-03-17T11:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.123417 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.123531 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.123552 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.123628 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.123647 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:29Z","lastTransitionTime":"2026-03-17T11:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.227474 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.227549 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.227564 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.227592 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.227611 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:29Z","lastTransitionTime":"2026-03-17T11:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.330942 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.331036 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.331062 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.331092 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.331111 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:29Z","lastTransitionTime":"2026-03-17T11:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.434868 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.434966 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.434988 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.435014 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.435032 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:29Z","lastTransitionTime":"2026-03-17T11:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.537133 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.537184 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.537197 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.537215 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.537226 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:29Z","lastTransitionTime":"2026-03-17T11:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.640254 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.640310 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.640326 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.640348 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.640363 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:29Z","lastTransitionTime":"2026-03-17T11:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.662108 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.662332 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:29 crc kubenswrapper[4742]: E0317 11:13:29.662368 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.662740 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:29 crc kubenswrapper[4742]: E0317 11:13:29.662861 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.663046 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:29 crc kubenswrapper[4742]: E0317 11:13:29.663038 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:29 crc kubenswrapper[4742]: E0317 11:13:29.663124 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:29 crc kubenswrapper[4742]: E0317 11:13:29.664470 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 11:13:29 crc kubenswrapper[4742]: E0317 11:13:29.664668 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:29 crc kubenswrapper[4742]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 17 11:13:29 crc kubenswrapper[4742]: set -uo pipefail Mar 17 11:13:29 crc kubenswrapper[4742]: Mar 17 11:13:29 crc kubenswrapper[4742]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 17 11:13:29 crc kubenswrapper[4742]: Mar 17 11:13:29 crc kubenswrapper[4742]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 17 11:13:29 crc kubenswrapper[4742]: HOSTS_FILE="/etc/hosts" 
Mar 17 11:13:29 crc kubenswrapper[4742]: TEMP_FILE="/etc/hosts.tmp" Mar 17 11:13:29 crc kubenswrapper[4742]: Mar 17 11:13:29 crc kubenswrapper[4742]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 17 11:13:29 crc kubenswrapper[4742]: Mar 17 11:13:29 crc kubenswrapper[4742]: # Make a temporary file with the old hosts file's attributes. Mar 17 11:13:29 crc kubenswrapper[4742]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 17 11:13:29 crc kubenswrapper[4742]: echo "Failed to preserve hosts file. Exiting." Mar 17 11:13:29 crc kubenswrapper[4742]: exit 1 Mar 17 11:13:29 crc kubenswrapper[4742]: fi Mar 17 11:13:29 crc kubenswrapper[4742]: Mar 17 11:13:29 crc kubenswrapper[4742]: while true; do Mar 17 11:13:29 crc kubenswrapper[4742]: declare -A svc_ips Mar 17 11:13:29 crc kubenswrapper[4742]: for svc in "${services[@]}"; do Mar 17 11:13:29 crc kubenswrapper[4742]: # Fetch service IP from cluster dns if present. We make several tries Mar 17 11:13:29 crc kubenswrapper[4742]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 17 11:13:29 crc kubenswrapper[4742]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 17 11:13:29 crc kubenswrapper[4742]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 17 11:13:29 crc kubenswrapper[4742]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 11:13:29 crc kubenswrapper[4742]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 11:13:29 crc kubenswrapper[4742]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 17 11:13:29 crc kubenswrapper[4742]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 17 11:13:29 crc kubenswrapper[4742]: for i in ${!cmds[*]} Mar 17 11:13:29 crc kubenswrapper[4742]: do Mar 17 11:13:29 crc kubenswrapper[4742]: ips=($(eval "${cmds[i]}")) Mar 17 11:13:29 crc kubenswrapper[4742]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 17 11:13:29 crc kubenswrapper[4742]: svc_ips["${svc}"]="${ips[@]}" Mar 17 11:13:29 crc kubenswrapper[4742]: break Mar 17 11:13:29 crc kubenswrapper[4742]: fi Mar 17 11:13:29 crc kubenswrapper[4742]: done Mar 17 11:13:29 crc kubenswrapper[4742]: done Mar 17 11:13:29 crc kubenswrapper[4742]: Mar 17 11:13:29 crc kubenswrapper[4742]: # Update /etc/hosts only if we get valid service IPs Mar 17 11:13:29 crc kubenswrapper[4742]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 17 11:13:29 crc kubenswrapper[4742]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 17 11:13:29 crc kubenswrapper[4742]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 17 11:13:29 crc kubenswrapper[4742]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 17 11:13:29 crc kubenswrapper[4742]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 17 11:13:29 crc kubenswrapper[4742]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 17 11:13:29 crc kubenswrapper[4742]: sleep 60 & wait Mar 17 11:13:29 crc kubenswrapper[4742]: continue Mar 17 11:13:29 crc kubenswrapper[4742]: fi Mar 17 11:13:29 crc kubenswrapper[4742]: Mar 17 11:13:29 crc kubenswrapper[4742]: # Append resolver entries for services Mar 17 11:13:29 crc kubenswrapper[4742]: rc=0 Mar 17 11:13:29 crc kubenswrapper[4742]: for svc in "${!svc_ips[@]}"; do Mar 17 11:13:29 crc kubenswrapper[4742]: for ip in ${svc_ips[${svc}]}; do Mar 17 11:13:29 crc kubenswrapper[4742]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 17 11:13:29 crc kubenswrapper[4742]: done Mar 17 11:13:29 crc kubenswrapper[4742]: done Mar 17 11:13:29 crc kubenswrapper[4742]: if [[ $rc -ne 0 ]]; then Mar 17 11:13:29 crc kubenswrapper[4742]: sleep 60 & wait Mar 17 11:13:29 crc kubenswrapper[4742]: continue Mar 17 11:13:29 crc kubenswrapper[4742]: fi Mar 17 11:13:29 crc kubenswrapper[4742]: Mar 17 11:13:29 crc kubenswrapper[4742]: Mar 17 11:13:29 crc kubenswrapper[4742]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 17 11:13:29 crc kubenswrapper[4742]: # Replace /etc/hosts with our modified version if needed Mar 17 11:13:29 crc kubenswrapper[4742]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 17 11:13:29 crc kubenswrapper[4742]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 17 11:13:29 crc kubenswrapper[4742]: fi Mar 17 11:13:29 crc kubenswrapper[4742]: sleep 60 & wait Mar 17 11:13:29 crc kubenswrapper[4742]: unset svc_ips Mar 17 11:13:29 crc kubenswrapper[4742]: done Mar 17 11:13:29 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5w4nh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-kwrj5_openshift-dns(fa31fa5e-119d-4392-b5c6-8f4a488e64af): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:29 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:29 crc 
kubenswrapper[4742]: E0317 11:13:29.665084 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:29 crc kubenswrapper[4742]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 17 11:13:29 crc kubenswrapper[4742]: if [[ -f "/env/_master" ]]; then Mar 17 11:13:29 crc kubenswrapper[4742]: set -o allexport Mar 17 11:13:29 crc kubenswrapper[4742]: source "/env/_master" Mar 17 11:13:29 crc kubenswrapper[4742]: set +o allexport Mar 17 11:13:29 crc kubenswrapper[4742]: fi Mar 17 11:13:29 crc kubenswrapper[4742]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 17 11:13:29 crc kubenswrapper[4742]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 17 11:13:29 crc kubenswrapper[4742]: ho_enable="--enable-hybrid-overlay" Mar 17 11:13:29 crc kubenswrapper[4742]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 17 11:13:29 crc kubenswrapper[4742]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 17 11:13:29 crc kubenswrapper[4742]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 17 11:13:29 crc kubenswrapper[4742]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 17 11:13:29 crc kubenswrapper[4742]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 17 11:13:29 crc kubenswrapper[4742]: --webhook-host=127.0.0.1 \ Mar 17 11:13:29 crc kubenswrapper[4742]: --webhook-port=9743 \ Mar 17 11:13:29 crc kubenswrapper[4742]: ${ho_enable} \ Mar 17 11:13:29 crc kubenswrapper[4742]: --enable-interconnect \ Mar 17 11:13:29 crc kubenswrapper[4742]: --disable-approver \ Mar 17 11:13:29 crc kubenswrapper[4742]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 17 11:13:29 crc kubenswrapper[4742]: --wait-for-kubernetes-api=200s \ Mar 17 11:13:29 crc kubenswrapper[4742]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 17 11:13:29 crc kubenswrapper[4742]: --loglevel="${LOGLEVEL}" Mar 17 11:13:29 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:29 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:29 crc kubenswrapper[4742]: E0317 11:13:29.666240 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 17 11:13:29 crc kubenswrapper[4742]: E0317 11:13:29.666281 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-kwrj5" podUID="fa31fa5e-119d-4392-b5c6-8f4a488e64af" Mar 17 11:13:29 crc kubenswrapper[4742]: E0317 11:13:29.667225 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:29 crc kubenswrapper[4742]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 17 11:13:29 crc kubenswrapper[4742]: if [[ -f "/env/_master" ]]; then Mar 17 11:13:29 crc kubenswrapper[4742]: set -o allexport Mar 17 11:13:29 crc kubenswrapper[4742]: source "/env/_master" Mar 17 11:13:29 crc kubenswrapper[4742]: set +o allexport Mar 17 11:13:29 crc kubenswrapper[4742]: fi Mar 17 11:13:29 crc kubenswrapper[4742]: Mar 17 11:13:29 crc kubenswrapper[4742]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 17 11:13:29 crc kubenswrapper[4742]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 17 11:13:29 crc kubenswrapper[4742]: --disable-webhook \ Mar 17 11:13:29 crc kubenswrapper[4742]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 17 11:13:29 crc kubenswrapper[4742]: 
--loglevel="${LOGLEVEL}" Mar 17 11:13:29 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:29 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:29 crc kubenswrapper[4742]: E0317 11:13:29.668376 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.742890 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.743002 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.743026 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.743056 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.743078 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:29Z","lastTransitionTime":"2026-03-17T11:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.846001 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.846245 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.846291 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.846327 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.846351 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:29Z","lastTransitionTime":"2026-03-17T11:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.949341 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.949428 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.949450 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.949481 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:29 crc kubenswrapper[4742]: I0317 11:13:29.949509 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:29Z","lastTransitionTime":"2026-03-17T11:13:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.052852 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.052941 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.052954 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.052974 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.052986 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:30Z","lastTransitionTime":"2026-03-17T11:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.156216 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.156302 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.156341 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.156417 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.156445 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:30Z","lastTransitionTime":"2026-03-17T11:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.195487 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs\") pod \"network-metrics-daemon-drnx8\" (UID: \"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\") " pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:30 crc kubenswrapper[4742]: E0317 11:13:30.195669 4742 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 11:13:30 crc kubenswrapper[4742]: E0317 11:13:30.195777 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs podName:6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:34.195750119 +0000 UTC m=+117.321877917 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs") pod "network-metrics-daemon-drnx8" (UID: "6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.258798 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.258877 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.258901 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.258970 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.258995 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:30Z","lastTransitionTime":"2026-03-17T11:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.362388 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.362446 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.362463 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.362487 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.362507 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:30Z","lastTransitionTime":"2026-03-17T11:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.465546 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.465588 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.465606 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.465627 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.465640 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:30Z","lastTransitionTime":"2026-03-17T11:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.569060 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.569107 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.569117 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.569134 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.569144 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:30Z","lastTransitionTime":"2026-03-17T11:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:30 crc kubenswrapper[4742]: E0317 11:13:30.664661 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r2qj9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-hcxv8_openshift-multus(a0932050-dced-4c05-b9d2-d8db1db0dceb): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 11:13:30 crc kubenswrapper[4742]: E0317 11:13:30.665313 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:30 crc kubenswrapper[4742]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 17 11:13:30 crc kubenswrapper[4742]: while [ true ]; Mar 17 11:13:30 crc kubenswrapper[4742]: do Mar 17 11:13:30 crc kubenswrapper[4742]: for f in $(ls /tmp/serviceca); do Mar 17 11:13:30 crc kubenswrapper[4742]: echo $f Mar 17 11:13:30 crc kubenswrapper[4742]: ca_file_path="/tmp/serviceca/${f}" Mar 17 11:13:30 crc kubenswrapper[4742]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 17 11:13:30 crc kubenswrapper[4742]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 17 11:13:30 crc kubenswrapper[4742]: if [ -e "${reg_dir_path}" ]; then Mar 17 11:13:30 crc kubenswrapper[4742]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 17 11:13:30 crc kubenswrapper[4742]: else Mar 17 11:13:30 crc kubenswrapper[4742]: mkdir $reg_dir_path Mar 17 11:13:30 crc kubenswrapper[4742]: cp $ca_file_path $reg_dir_path/ca.crt Mar 17 11:13:30 crc kubenswrapper[4742]: fi Mar 17 11:13:30 crc kubenswrapper[4742]: done Mar 17 11:13:30 crc kubenswrapper[4742]: for d in $(ls /etc/docker/certs.d); do Mar 17 11:13:30 crc kubenswrapper[4742]: echo $d Mar 17 11:13:30 crc kubenswrapper[4742]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') 
Mar 17 11:13:30 crc kubenswrapper[4742]: reg_conf_path="/tmp/serviceca/${dp}" Mar 17 11:13:30 crc kubenswrapper[4742]: if [ ! -e "${reg_conf_path}" ]; then Mar 17 11:13:30 crc kubenswrapper[4742]: rm -rf /etc/docker/certs.d/$d Mar 17 11:13:30 crc kubenswrapper[4742]: fi Mar 17 11:13:30 crc kubenswrapper[4742]: done Mar 17 11:13:30 crc kubenswrapper[4742]: sleep 60 & wait ${!} Mar 17 11:13:30 crc kubenswrapper[4742]: done Mar 17 11:13:30 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-92vc8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-hv2p6_openshift-image-registry(ad7af928-88e1-468c-9471-8e7902a4a6ee): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:30 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:30 crc kubenswrapper[4742]: E0317 11:13:30.665789 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:30 crc kubenswrapper[4742]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 17 11:13:30 crc kubenswrapper[4742]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 17 11:13:30 crc kubenswrapper[4742]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 
0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4w98f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-xwmfc_openshift-multus(ff1068ee-5ebe-4575-806d-967a3b9bfb6a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:30 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:30 crc kubenswrapper[4742]: E0317 11:13:30.665954 4742 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" podUID="a0932050-dced-4c05-b9d2-d8db1db0dceb" Mar 17 11:13:30 crc kubenswrapper[4742]: E0317 11:13:30.666759 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-hv2p6" podUID="ad7af928-88e1-468c-9471-8e7902a4a6ee" Mar 17 11:13:30 crc kubenswrapper[4742]: E0317 11:13:30.667001 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-xwmfc" podUID="ff1068ee-5ebe-4575-806d-967a3b9bfb6a" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.672174 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.672417 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.672460 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.672495 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.672518 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:30Z","lastTransitionTime":"2026-03-17T11:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.775804 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.776145 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.776168 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.776196 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.776214 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:30Z","lastTransitionTime":"2026-03-17T11:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.878300 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.878357 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.878370 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.878387 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.878399 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:30Z","lastTransitionTime":"2026-03-17T11:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.983260 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.983371 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.983430 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.983464 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:30 crc kubenswrapper[4742]: I0317 11:13:30.983528 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:30Z","lastTransitionTime":"2026-03-17T11:13:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.087588 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.087640 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.087657 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.087681 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.087698 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:31Z","lastTransitionTime":"2026-03-17T11:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.191599 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.191683 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.191709 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.191742 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.191765 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:31Z","lastTransitionTime":"2026-03-17T11:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.294002 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.294062 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.294077 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.294098 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.294115 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:31Z","lastTransitionTime":"2026-03-17T11:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.397498 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.397588 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.397612 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.397638 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.397657 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:31Z","lastTransitionTime":"2026-03-17T11:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.501068 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.502116 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.502149 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.502180 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.502198 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:31Z","lastTransitionTime":"2026-03-17T11:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.606334 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.606396 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.606416 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.606453 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.606491 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:31Z","lastTransitionTime":"2026-03-17T11:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.662990 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.663053 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.663112 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.663158 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:31 crc kubenswrapper[4742]: E0317 11:13:31.663358 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:31 crc kubenswrapper[4742]: E0317 11:13:31.663451 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:31 crc kubenswrapper[4742]: E0317 11:13:31.663588 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:31 crc kubenswrapper[4742]: E0317 11:13:31.663652 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.709757 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.709853 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.709878 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.709943 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.709969 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:31Z","lastTransitionTime":"2026-03-17T11:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.812284 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.812350 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.812367 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.812394 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.812411 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:31Z","lastTransitionTime":"2026-03-17T11:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.915378 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.915454 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.915475 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.915506 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:31 crc kubenswrapper[4742]: I0317 11:13:31.915525 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:31Z","lastTransitionTime":"2026-03-17T11:13:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.019528 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.019600 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.019621 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.019646 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.019664 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:32Z","lastTransitionTime":"2026-03-17T11:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.123244 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.123315 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.123334 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.123362 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.123382 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:32Z","lastTransitionTime":"2026-03-17T11:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.227217 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.227282 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.227293 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.227316 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.227332 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:32Z","lastTransitionTime":"2026-03-17T11:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.330643 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.330699 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.330709 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.330724 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.330737 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:32Z","lastTransitionTime":"2026-03-17T11:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.434666 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.434742 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.434760 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.434792 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.434811 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:32Z","lastTransitionTime":"2026-03-17T11:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.537593 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.537660 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.537697 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.537723 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.537743 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:32Z","lastTransitionTime":"2026-03-17T11:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.641151 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.641198 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.641211 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.641234 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.641248 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:32Z","lastTransitionTime":"2026-03-17T11:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:32 crc kubenswrapper[4742]: E0317 11:13:32.665077 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:32 crc kubenswrapper[4742]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 17 11:13:32 crc kubenswrapper[4742]: apiVersion: v1 Mar 17 11:13:32 crc kubenswrapper[4742]: clusters: Mar 17 11:13:32 crc kubenswrapper[4742]: - cluster: Mar 17 11:13:32 crc kubenswrapper[4742]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 17 11:13:32 crc kubenswrapper[4742]: server: https://api-int.crc.testing:6443 Mar 17 11:13:32 crc kubenswrapper[4742]: name: default-cluster Mar 17 11:13:32 crc kubenswrapper[4742]: contexts: Mar 17 11:13:32 crc kubenswrapper[4742]: - context: Mar 17 11:13:32 crc kubenswrapper[4742]: cluster: default-cluster Mar 17 11:13:32 crc kubenswrapper[4742]: namespace: default Mar 17 11:13:32 crc kubenswrapper[4742]: user: default-auth Mar 17 11:13:32 crc kubenswrapper[4742]: name: default-context Mar 17 11:13:32 crc kubenswrapper[4742]: current-context: default-context Mar 17 11:13:32 crc kubenswrapper[4742]: kind: Config Mar 17 11:13:32 crc kubenswrapper[4742]: preferences: {} Mar 17 11:13:32 crc kubenswrapper[4742]: users: Mar 17 11:13:32 crc kubenswrapper[4742]: - name: default-auth Mar 17 11:13:32 crc kubenswrapper[4742]: user: Mar 17 11:13:32 crc kubenswrapper[4742]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 17 11:13:32 crc kubenswrapper[4742]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 17 11:13:32 crc kubenswrapper[4742]: EOF Mar 17 11:13:32 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qkjp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:32 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:32 crc kubenswrapper[4742]: E0317 11:13:32.665651 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:13:32 crc kubenswrapper[4742]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 17 11:13:32 crc kubenswrapper[4742]: set -o allexport Mar 17 11:13:32 crc kubenswrapper[4742]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 17 11:13:32 crc kubenswrapper[4742]: source 
/etc/kubernetes/apiserver-url.env Mar 17 11:13:32 crc kubenswrapper[4742]: else Mar 17 11:13:32 crc kubenswrapper[4742]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 17 11:13:32 crc kubenswrapper[4742]: exit 1 Mar 17 11:13:32 crc kubenswrapper[4742]: fi Mar 17 11:13:32 crc kubenswrapper[4742]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 17 11:13:32 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d9
5b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 17 11:13:32 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:13:32 crc kubenswrapper[4742]: E0317 11:13:32.666482 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" Mar 17 11:13:32 crc kubenswrapper[4742]: E0317 11:13:32.666994 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.744002 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.744071 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.744087 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:32 
crc kubenswrapper[4742]: I0317 11:13:32.744113 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:32 crc kubenswrapper[4742]: I0317 11:13:32.744128 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:32Z","lastTransitionTime":"2026-03-17T11:13:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.055394 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.055426 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.055434 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.055448 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.055456 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:33Z","lastTransitionTime":"2026-03-17T11:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.159232 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.159291 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.159302 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.159321 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.159336 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:33Z","lastTransitionTime":"2026-03-17T11:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.262510 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.262644 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.262674 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.262709 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.262734 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:33Z","lastTransitionTime":"2026-03-17T11:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.366330 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.366387 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.366406 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.366430 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.366448 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:33Z","lastTransitionTime":"2026-03-17T11:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.432276 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.432510 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.432610 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:13:49.432556026 +0000 UTC m=+132.558683954 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.432662 4742 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.432744 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.432774 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:49.432739361 +0000 UTC m=+132.558867299 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.432823 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.432884 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.432966 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.432991 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.433005 4742 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] 
Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.433050 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:49.433041429 +0000 UTC m=+132.559169397 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.433093 4742 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.433147 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.433188 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:49.433162012 +0000 UTC m=+132.559289910 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.433193 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.433239 4742 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.433304 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:49.433286126 +0000 UTC m=+132.559413944 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.662198 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.662313 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.662240 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.662445 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.662403 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.662645 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.663190 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.663439 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.665832 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vmpzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.668530 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt 
--tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vmpzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 17 11:13:33 crc kubenswrapper[4742]: E0317 11:13:33.669794 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.678544 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.678602 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.678615 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.678637 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:33 crc kubenswrapper[4742]: I0317 11:13:33.678653 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:33Z","lastTransitionTime":"2026-03-17T11:13:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:34 crc kubenswrapper[4742]: I0317 11:13:34.243170 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs\") pod \"network-metrics-daemon-drnx8\" (UID: \"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\") " pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:34 crc kubenswrapper[4742]: E0317 11:13:34.243381 4742 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 11:13:34 crc kubenswrapper[4742]: E0317 11:13:34.243491 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs podName:6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:42.243460539 +0000 UTC m=+125.369588337 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs") pod "network-metrics-daemon-drnx8" (UID: "6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 11:13:34 crc kubenswrapper[4742]: I0317 11:13:34.402757 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:34 crc kubenswrapper[4742]: I0317 11:13:34.402819 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:34 crc kubenswrapper[4742]: I0317 11:13:34.402831 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:34 crc kubenswrapper[4742]: I0317 11:13:34.402852 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:34 crc kubenswrapper[4742]: I0317 11:13:34.402868 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:34Z","lastTransitionTime":"2026-03-17T11:13:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:35 crc kubenswrapper[4742]: I0317 11:13:35.661869 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:35 crc kubenswrapper[4742]: I0317 11:13:35.661943 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:35 crc kubenswrapper[4742]: I0317 11:13:35.662024 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:35 crc kubenswrapper[4742]: I0317 11:13:35.662131 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:35 crc kubenswrapper[4742]: E0317 11:13:35.662134 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:35 crc kubenswrapper[4742]: E0317 11:13:35.662268 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:35 crc kubenswrapper[4742]: E0317 11:13:35.662361 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:35 crc kubenswrapper[4742]: E0317 11:13:35.662587 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:35 crc kubenswrapper[4742]: I0317 11:13:35.746586 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:35 crc kubenswrapper[4742]: I0317 11:13:35.746636 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:35 crc kubenswrapper[4742]: I0317 11:13:35.746650 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:35 crc kubenswrapper[4742]: I0317 11:13:35.746669 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:35 crc kubenswrapper[4742]: I0317 11:13:35.746680 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:35Z","lastTransitionTime":"2026-03-17T11:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:35 crc kubenswrapper[4742]: I0317 11:13:35.849241 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:35 crc kubenswrapper[4742]: I0317 11:13:35.849286 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:35 crc kubenswrapper[4742]: I0317 11:13:35.849296 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:35 crc kubenswrapper[4742]: I0317 11:13:35.849313 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:35 crc kubenswrapper[4742]: I0317 11:13:35.849322 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:35Z","lastTransitionTime":"2026-03-17T11:13:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.506178 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.517022 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.550591 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.565953 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.577415 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.580106 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.580141 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.580151 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.580167 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.580176 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:36Z","lastTransitionTime":"2026-03-17T11:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.593010 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.608457 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.616227 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.624533 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.630345 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.639713 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.658109 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.667687 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.676785 4742 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.682197 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.682237 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.682247 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.682262 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.682272 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:36Z","lastTransitionTime":"2026-03-17T11:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.687480 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.696391 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.713621 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.785412 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.785448 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:36 crc kubenswrapper[4742]: 
I0317 11:13:36.785456 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.785473 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.785483 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:36Z","lastTransitionTime":"2026-03-17T11:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.888227 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.888263 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.888272 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.888286 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.888294 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:36Z","lastTransitionTime":"2026-03-17T11:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.991356 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.991437 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.991463 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.991495 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:36 crc kubenswrapper[4742]: I0317 11:13:36.991513 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:36Z","lastTransitionTime":"2026-03-17T11:13:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.094630 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.094710 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.094731 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.094760 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.094779 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:37Z","lastTransitionTime":"2026-03-17T11:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.196956 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.197240 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.197253 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.197276 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.197289 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:37Z","lastTransitionTime":"2026-03-17T11:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.300149 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.300249 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.300277 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.300316 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.300343 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:37Z","lastTransitionTime":"2026-03-17T11:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.403660 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.403730 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.403749 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.403777 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.403801 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:37Z","lastTransitionTime":"2026-03-17T11:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.507335 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.507414 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.507433 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.507461 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.507479 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:37Z","lastTransitionTime":"2026-03-17T11:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.610727 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.610794 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.610818 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.610849 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.610871 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:37Z","lastTransitionTime":"2026-03-17T11:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.662403 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.662483 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.662537 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:37 crc kubenswrapper[4742]: E0317 11:13:37.662664 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.662712 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:37 crc kubenswrapper[4742]: E0317 11:13:37.662828 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:37 crc kubenswrapper[4742]: E0317 11:13:37.663626 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:37 crc kubenswrapper[4742]: E0317 11:13:37.663962 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.714485 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.714557 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.714576 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.714604 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.714624 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:37Z","lastTransitionTime":"2026-03-17T11:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.818181 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.818333 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.818362 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.818450 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.818485 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:37Z","lastTransitionTime":"2026-03-17T11:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.922735 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.922821 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.922849 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.922886 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:37 crc kubenswrapper[4742]: I0317 11:13:37.922958 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:37Z","lastTransitionTime":"2026-03-17T11:13:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.026129 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.026219 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.026237 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.026264 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.026283 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:38Z","lastTransitionTime":"2026-03-17T11:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.128878 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.129001 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.129019 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.129044 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.129064 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:38Z","lastTransitionTime":"2026-03-17T11:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.232422 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.232471 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.232485 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.232503 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.232514 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:38Z","lastTransitionTime":"2026-03-17T11:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.335255 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.335316 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.335330 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.335351 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.335364 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:38Z","lastTransitionTime":"2026-03-17T11:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.438461 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.438552 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.438577 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.438611 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.438635 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:38Z","lastTransitionTime":"2026-03-17T11:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.541853 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.541989 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.542009 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.542034 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.542052 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:38Z","lastTransitionTime":"2026-03-17T11:13:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:38 crc kubenswrapper[4742]: E0317 11:13:38.642304 4742 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.674958 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.693316 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.703213 4742 status_manager.go:875] "Failed to update status for pod"
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection 
refused" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.713720 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.726339 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.737183 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.746654 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.763320 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.781952 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.791497 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.803831 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.814684 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 
17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.825073 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70
a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.831771 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.839573 4742 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:38 crc kubenswrapper[4742]: I0317 11:13:38.852040 4742 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:38 crc kubenswrapper[4742]: E0317 11:13:38.880952 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.274988 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.275040 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.275057 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.275081 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.275100 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:39Z","lastTransitionTime":"2026-03-17T11:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:39 crc kubenswrapper[4742]: E0317 11:13:39.292110 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.297853 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.297882 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.297889 4742 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.297903 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.297924 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:39Z","lastTransitionTime":"2026-03-17T11:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:39 crc kubenswrapper[4742]: E0317 11:13:39.311752 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.316930 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.316967 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.316977 4742 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.316991 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.317002 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:39Z","lastTransitionTime":"2026-03-17T11:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:39 crc kubenswrapper[4742]: E0317 11:13:39.330988 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.336495 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.336586 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.336637 4742 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.336663 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.336726 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:39Z","lastTransitionTime":"2026-03-17T11:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:39 crc kubenswrapper[4742]: E0317 11:13:39.354087 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.359786 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.359857 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.359876 4742 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.359929 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.359948 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:39Z","lastTransitionTime":"2026-03-17T11:13:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:39 crc kubenswrapper[4742]: E0317 11:13:39.375256 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:39 crc kubenswrapper[4742]: E0317 11:13:39.375420 4742 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.440541 4742 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.662824 4742 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.662868 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.662848 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:39 crc kubenswrapper[4742]: I0317 11:13:39.662844 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:39 crc kubenswrapper[4742]: E0317 11:13:39.662998 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:39 crc kubenswrapper[4742]: E0317 11:13:39.663163 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:39 crc kubenswrapper[4742]: E0317 11:13:39.663254 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:39 crc kubenswrapper[4742]: E0317 11:13:39.663327 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.311647 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kwrj5" event={"ID":"fa31fa5e-119d-4392-b5c6-8f4a488e64af","Type":"ContainerStarted","Data":"c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7"} Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.313626 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" event={"ID":"b8c53ad4-b584-48be-8055-a928c8a0178f","Type":"ContainerStarted","Data":"4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf"} Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.313695 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" event={"ID":"b8c53ad4-b584-48be-8055-a928c8a0178f","Type":"ContainerStarted","Data":"a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646"} Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.329594 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.345527 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.360513 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.387983 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.400572 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.423598 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.441010 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.456576 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.468828 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.480458 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.490318 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.498687 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.506552 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.514348 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.524379 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.532989 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.542507 4742 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.555746 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserv
er-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.565151 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.576770 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.585174 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.597276 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.616584 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.638757 4742 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 
11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.662323 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.662332 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.662432 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.662488 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:41 crc kubenswrapper[4742]: E0317 11:13:41.662599 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.662675 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:41 crc kubenswrapper[4742]: E0317 11:13:41.663160 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:41 crc kubenswrapper[4742]: E0317 11:13:41.663275 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:41 crc kubenswrapper[4742]: E0317 11:13:41.663399 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.686423 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.702501 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.748173 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.763054 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.782324 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.792318 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:41 crc kubenswrapper[4742]: I0317 11:13:41.801555 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.317946 4742 generic.go:334] "Generic (PLEG): container finished" podID="a0932050-dced-4c05-b9d2-d8db1db0dceb" containerID="bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585" exitCode=0 Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.318009 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" event={"ID":"a0932050-dced-4c05-b9d2-d8db1db0dceb","Type":"ContainerDied","Data":"bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585"} Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.329088 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.335653 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs\") pod \"network-metrics-daemon-drnx8\" (UID: \"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\") " pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:42 crc kubenswrapper[4742]: E0317 11:13:42.336020 4742 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 11:13:42 crc kubenswrapper[4742]: E0317 11:13:42.336140 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs podName:6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14 nodeName:}" failed. No retries permitted until 2026-03-17 11:13:58.336111331 +0000 UTC m=+141.462239139 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs") pod "network-metrics-daemon-drnx8" (UID: "6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.340302 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{
\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.349163 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.360304 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.378540 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.388575 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.398499 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.423573 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.443877 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.458939 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.468735 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.476102 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 
17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.490243 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70
a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.498411 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.511819 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:42 crc kubenswrapper[4742]: I0317 11:13:42.527402 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.324652 4742 generic.go:334] "Generic (PLEG): container finished" podID="a0932050-dced-4c05-b9d2-d8db1db0dceb" containerID="9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280" exitCode=0 Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.324715 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" event={"ID":"a0932050-dced-4c05-b9d2-d8db1db0dceb","Type":"ContainerDied","Data":"9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280"} Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.346492 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.360063 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.370164 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.381782 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.394596 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.411337 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-
17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.424316 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":
\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.435434 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 
127.0.0.1:9743: connect: connection refused" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.446112 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.455143 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.463444 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.479593 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.496710 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.506829 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.517514 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.525721 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 
17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.662104 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.662248 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.662281 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:43 crc kubenswrapper[4742]: I0317 11:13:43.662283 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:43 crc kubenswrapper[4742]: E0317 11:13:43.663185 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:43 crc kubenswrapper[4742]: E0317 11:13:43.663324 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:43 crc kubenswrapper[4742]: E0317 11:13:43.663445 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:43 crc kubenswrapper[4742]: E0317 11:13:43.663525 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:43 crc kubenswrapper[4742]: E0317 11:13:43.882729 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
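[Annotation: the entries above fail with "dial tcp 127.0.0.1:9743: connect: connection refused" while the network-node-identity webhook is still coming up; in the entries that follow, the webhook is listening but the same status patches fail TLS verification, because the host clock reads 2026-03-17 while the serving certificate's NotAfter is 2025-08-24. A minimal sketch of that validity-window check, using Go's crypto/x509 (the same package whose error string appears in the journal); the file path "serving.crt" is an illustrative placeholder, not taken from the log:

    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "os"
    )

    func main() {
        // Load the webhook's PEM-encoded serving certificate (placeholder path).
        pemBytes, err := os.ReadFile("serving.crt")
        if err != nil {
            panic(err)
        }
        block, _ := pem.Decode(pemBytes)
        if block == nil {
            panic("no PEM block found in serving.crt")
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            panic(err)
        }
        // Print the validity window recorded in the certificate itself.
        fmt.Println("NotBefore:", cert.NotBefore, "NotAfter:", cert.NotAfter)
        // Verify against the current system time. Go checks the leaf's
        // validity window first, so a cert past NotAfter is reported as
        // "x509: certificate has expired or is not yet valid: current time
        // ... is after ..." -- the exact error rendered in the log entries below.
        if _, err := cert.Verify(x509.VerifyOptions{}); err != nil {
            fmt.Println("verify error:", err)
        }
    }

With the node clock at 2026-03-17T11:13:44Z and a NotAfter of 2025-08-24T17:21:41Z, this check fails, so every HTTPS POST to https://127.0.0.1:9743/pod is rejected during the TLS handshake and the kubelet's status patches keep failing until the certificate is rotated or the clock is corrected. End of annotation; the journal continues verbatim.]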
Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.331295 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635"} Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.331380 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee"} Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.335715 4742 generic.go:334] "Generic (PLEG): container finished" podID="a0932050-dced-4c05-b9d2-d8db1db0dceb" containerID="e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06" exitCode=0 Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.335758 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" event={"ID":"a0932050-dced-4c05-b9d2-d8db1db0dceb","Type":"ContainerDied","Data":"e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06"} Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.349473 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.366092 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.387435 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.404982 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.430428 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.450950 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.467133 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.478379 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.494008 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.506984 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.523567 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.538193 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.551512 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.568157 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.585829 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 
11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.599320 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.614227 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.627527 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.638084 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 
11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.649299 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.659899 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.675818 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.688242 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.707492 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.728051 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.748440 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.764820 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.778779 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.795802 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.808198 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.826449 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:44 crc kubenswrapper[4742]: I0317 11:13:44.842116 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:44Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.345755 4742 generic.go:334] "Generic (PLEG): container finished" podID="a0932050-dced-4c05-b9d2-d8db1db0dceb" containerID="900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5" exitCode=0 Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.345839 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" event={"ID":"a0932050-dced-4c05-b9d2-d8db1db0dceb","Type":"ContainerDied","Data":"900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5"} Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.350892 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d"} Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.350983 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092"} Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.353959 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074"} Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.373797 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.393361 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.414261 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.440693 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.469984 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.488086 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.505615 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.521760 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.538886 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.550667 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.563704 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.581459 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.600549 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.618712 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\"
,\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.630741 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 
11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.642074 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.659487 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.662129 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.662195 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.662252 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.662486 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:45 crc kubenswrapper[4742]: E0317 11:13:45.662410 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:45 crc kubenswrapper[4742]: E0317 11:13:45.662694 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:45 crc kubenswrapper[4742]: E0317 11:13:45.662786 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:45 crc kubenswrapper[4742]: E0317 11:13:45.663378 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.676876 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.693232 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.710195 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.727260 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.746844 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.768035 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\"
:\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.781550 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 
11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.798551 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.812089 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.824325 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.842526 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.851023 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.868629 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.883548 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:45 crc kubenswrapper[4742]: I0317 11:13:45.893213 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:45Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.361078 4742 generic.go:334] "Generic (PLEG): container finished" podID="a0932050-dced-4c05-b9d2-d8db1db0dceb" containerID="cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3" exitCode=0 Mar 17 11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.361143 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" event={"ID":"a0932050-dced-4c05-b9d2-d8db1db0dceb","Type":"ContainerDied","Data":"cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3"} Mar 17 11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.382441 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1
74f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:46Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.417161 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:46Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.429510 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:46Z is after 2025-08-24T17:21:41Z" Mar 17 
11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.445107 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:46Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.459438 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:46Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.482284 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:46Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.495839 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:46Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.521671 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:46Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.556051 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:46Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.582777 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:46Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.602057 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:46Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.613501 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:46Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.629431 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:46Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.641640 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:46Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.652024 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:46Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:46 crc kubenswrapper[4742]: I0317 11:13:46.667196 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:46Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.370304 4742 generic.go:334] "Generic (PLEG): container finished" podID="a0932050-dced-4c05-b9d2-d8db1db0dceb" containerID="0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398" exitCode=0 Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.370402 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" event={"ID":"a0932050-dced-4c05-b9d2-d8db1db0dceb","Type":"ContainerDied","Data":"0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398"} Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.372858 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwmfc" event={"ID":"ff1068ee-5ebe-4575-806d-967a3b9bfb6a","Type":"ContainerStarted","Data":"e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622"} Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.375045 4742 generic.go:334] "Generic (PLEG): container finished" podID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerID="8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38" exitCode=0 Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.375121 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerDied","Data":"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38"} Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.377989 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hv2p6" event={"ID":"ad7af928-88e1-468c-9471-8e7902a4a6ee","Type":"ContainerStarted","Data":"078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e"} Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.380124 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402"} Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.384868 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.408350 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.420451 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.440225 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.460453 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.475587 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.488445 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.502063 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.515704 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.526016 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.538808 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.552428 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.566975 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.583820 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-0
3-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.596435 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.607851 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc 
kubenswrapper[4742]: I0317 11:13:47.621468 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.632777 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.646342 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.662047 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.662107 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.662117 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.662059 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:47 crc kubenswrapper[4742]: E0317 11:13:47.662165 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:47 crc kubenswrapper[4742]: E0317 11:13:47.662291 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:47 crc kubenswrapper[4742]: E0317 11:13:47.662342 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:47 crc kubenswrapper[4742]: E0317 11:13:47.662423 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.673310 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a64879
4f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.687096 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.704053 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.717154 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.747729 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-api
server-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.766845 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.777410 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 
11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.787479 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.804086 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.817346 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.835378 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z 
is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.850796 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:47 crc kubenswrapper[4742]: I0317 11:13:47.862977 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:47Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.389208 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerStarted","Data":"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0"} Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.389285 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerStarted","Data":"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425"} Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.389316 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerStarted","Data":"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397"} Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.389337 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerStarted","Data":"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33"} Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.389357 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerStarted","Data":"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2"} Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.389379 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerStarted","Data":"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545"} Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.393755 
4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" event={"ID":"a0932050-dced-4c05-b9d2-d8db1db0dceb","Type":"ContainerStarted","Data":"4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602"} Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.408707 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.427322 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.442332 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.460865 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z 
is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.481122 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.493242 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.503575 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.516280 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.531627 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.543523 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.554085 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.569008 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.580958 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.598933 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec
9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.609293 4742 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.619309 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.679728 4742 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\
\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.697745 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.711945 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.725468 4742 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\
\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.738731 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.759743 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"
cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.778053 
4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.791077 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.806538 4742 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.824481 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.837491 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.854536 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.871569 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: E0317 11:13:48.883290 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.890432 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.905225 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:48 crc kubenswrapper[4742]: I0317 11:13:48.914335 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.521675 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.521823 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.521948 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:14:21.521882434 +0000 UTC m=+164.648010222 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.522037 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.522087 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.522089 4742 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.522240 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:14:21.522207333 +0000 UTC m=+164.648335111 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.522260 4742 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.522337 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:14:21.522322928 +0000 UTC m=+164.648450716 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.522454 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.522486 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.522513 4742 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.522570 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 11:14:21.522548835 +0000 UTC m=+164.648676633 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.522670 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.522694 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.522714 4742 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.522809 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 11:14:21.522789152 +0000 UTC m=+164.648917140 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.522129 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.662371 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.662562 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.662370 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.662689 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.662867 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.663032 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.663147 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.663393 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.716776 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.716847 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.716870 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.716899 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.716948 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:49Z","lastTransitionTime":"2026-03-17T11:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.737649 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:49Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.742208 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.742424 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.742553 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.742701 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.742832 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:49Z","lastTransitionTime":"2026-03-17T11:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.758302 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:49Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.762453 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.762627 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
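
The retries are futile as long as the webhook's serving certificate stays expired: its notAfter is 2025-08-24T17:21:41Z while the node clock reads 2026-03-17. The wording "certificate has expired or is not yet valid" is Go's standard x509.CertificateInvalidError text, produced whenever the verification time falls outside the certificate's [NotBefore, NotAfter] window. A standalone check along those lines (the certificate path is a placeholder):

    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "os"
        "time"
    )

    func main() {
        // Placeholder path; point it at the webhook's serving certificate.
        data, err := os.ReadFile("serving-cert.pem")
        if err != nil {
            panic(err)
        }
        block, _ := pem.Decode(data)
        if block == nil {
            panic("no PEM block found")
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            panic(err)
        }
        now := time.Now().UTC()
        if now.After(cert.NotAfter) || now.Before(cert.NotBefore) {
            // Same condition the TLS handshake trips on in the log.
            fmt.Printf("certificate has expired or is not yet valid: current time %s is after %s\n",
                now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
        }
    }
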
event="NodeHasNoDiskPressure" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.762747 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.762902 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.763138 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:49Z","lastTransitionTime":"2026-03-17T11:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.784637 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:49Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.789538 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.789579 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
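
To see which certificate the listener on 127.0.0.1:9743 is actually presenting, it can be dialed with verification disabled and the peer chain printed; a small diagnostic sketch, not hardened code:

    package main

    import (
        "crypto/tls"
        "fmt"
    )

    func main() {
        // InsecureSkipVerify is deliberate here: the point is to inspect the
        // expired certificate, not to validate it.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            panic(err)
        }
        defer conn.Close()
        for _, cert := range conn.ConnectionState().PeerCertificates {
            fmt.Printf("subject=%s notBefore=%s notAfter=%s\n",
                cert.Subject, cert.NotBefore.Format("2006-01-02"), cert.NotAfter.Format("2006-01-02"))
        }
    }
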
event="NodeHasNoDiskPressure" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.789591 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.789609 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.789621 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:49Z","lastTransitionTime":"2026-03-17T11:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.807974 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:49Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.812652 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.812708 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.812722 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.812743 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:13:49 crc kubenswrapper[4742]: I0317 11:13:49.812759 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:13:49Z","lastTransitionTime":"2026-03-17T11:13:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.828387 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:49Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:49 crc kubenswrapper[4742]: E0317 11:13:49.828497 4742 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 11:13:50 crc kubenswrapper[4742]: I0317 11:13:50.406427 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerStarted","Data":"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65"} Mar 17 11:13:51 crc kubenswrapper[4742]: I0317 11:13:51.662238 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:51 crc kubenswrapper[4742]: I0317 11:13:51.662248 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:51 crc kubenswrapper[4742]: E0317 11:13:51.662824 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:51 crc kubenswrapper[4742]: I0317 11:13:51.662316 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:51 crc kubenswrapper[4742]: I0317 11:13:51.662289 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:51 crc kubenswrapper[4742]: E0317 11:13:51.662966 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:51 crc kubenswrapper[4742]: E0317 11:13:51.663035 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:51 crc kubenswrapper[4742]: E0317 11:13:51.663306 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.424310 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerStarted","Data":"d9ae5c289c3da9700dc4768431ed4fa91e0dc602cebfd45c5e304b0acde0c591"} Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.424500 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.424512 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.424700 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.453491 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.456155 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272
e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.459875 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.477073 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.493085 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.506555 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.520347 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f3
6dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.532900 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.548524 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.569991 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.587311 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.608026 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec
9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.621163 4742 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.636143 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.653585 4742 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.662166 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.662264 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.662355 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.662473 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:53 crc kubenswrapper[4742]: E0317 11:13:53.662473 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:53 crc kubenswrapper[4742]: E0317 11:13:53.662659 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:53 crc kubenswrapper[4742]: E0317 11:13:53.662763 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:53 crc kubenswrapper[4742]: E0317 11:13:53.662858 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.670428 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.684391 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.708640 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154
edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCo
unt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ae5c289c3da9700dc4768431ed4fa91e0dc602cebfd45c5e304b0acde0c591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.724544 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.737290 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.752051 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.765332 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.779603 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.809826 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec
9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.834129 4742 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.849780 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.861048 4742 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.870932 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: E0317 11:13:53.884205 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.888385 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04ae
e1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ae5c289c3da9700dc4768431ed4fa91e0dc602cebfd45c5e304b0acde0c591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\
\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.899302 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.918401 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.932862 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.947731 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:53 crc kubenswrapper[4742]: I0317 11:13:53.958899 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:53Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.438394 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovnkube-controller/0.log" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.441505 4742 generic.go:334] "Generic (PLEG): container finished" podID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerID="d9ae5c289c3da9700dc4768431ed4fa91e0dc602cebfd45c5e304b0acde0c591" exitCode=1 Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.441553 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerDied","Data":"d9ae5c289c3da9700dc4768431ed4fa91e0dc602cebfd45c5e304b0acde0c591"} Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.442699 4742 scope.go:117] "RemoveContainer" containerID="d9ae5c289c3da9700dc4768431ed4fa91e0dc602cebfd45c5e304b0acde0c591" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.458123 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:55Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.480344 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"h
ost-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:55Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.496144 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:55Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.508740 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:55Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.521126 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:55Z is after 2025-08-24T17:21:41Z" Mar 17 
11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.534649 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:55Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.552412 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:55Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.572031 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:55Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.589252 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ae5c289c3da9700dc4768431ed4fa91e0dc602cebfd45c5e304b0acde0c591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ae5c289c3da9700dc4768431ed4fa91e0dc602cebfd45c5e304b0acde0c591\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:13:54Z\\\",\\\"message\\\":\\\" removal\\\\nI0317 11:13:54.838728 6767 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 11:13:54.838737 6767 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 11:13:54.838747 6767 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0317 11:13:54.838759 6767 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0317 11:13:54.838763 6767 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 11:13:54.838718 6767 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:13:54.838780 6767 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0317 11:13:54.838794 6767 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:13:54.838798 6767 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 11:13:54.838818 6767 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 11:13:54.838823 6767 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0317 11:13:54.838816 6767 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 11:13:54.838854 6767 factory.go:656] Stopping watch factory\\\\nI0317 11:13:54.838871 6767 ovnkube.go:599] Stopped ovnkube\\\\nI0317 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:55Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.605775 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:55Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.618032 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:55Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.630007 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:55Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.642101 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:55Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.651665 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:55Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.662592 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:55 crc kubenswrapper[4742]: E0317 11:13:55.662905 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.663110 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:55 crc kubenswrapper[4742]: E0317 11:13:55.663171 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.663405 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:55 crc kubenswrapper[4742]: E0317 11:13:55.663462 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.663496 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:55 crc kubenswrapper[4742]: E0317 11:13:55.663534 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.678138 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a6487
94f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:55Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:55 crc kubenswrapper[4742]: I0317 11:13:55.689469 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:55Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.447766 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovnkube-controller/1.log" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.448743 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovnkube-controller/0.log" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.452197 4742 generic.go:334] "Generic (PLEG): container finished" podID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerID="2ed0101f2b36840ce3821eb83c8aabb050bfbb5fc1ac73e1e34a58fe74202a6d" exitCode=1 Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.452258 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerDied","Data":"2ed0101f2b36840ce3821eb83c8aabb050bfbb5fc1ac73e1e34a58fe74202a6d"} Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.452351 4742 scope.go:117] "RemoveContainer" containerID="d9ae5c289c3da9700dc4768431ed4fa91e0dc602cebfd45c5e304b0acde0c591" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.453571 4742 scope.go:117] "RemoveContainer" containerID="2ed0101f2b36840ce3821eb83c8aabb050bfbb5fc1ac73e1e34a58fe74202a6d" Mar 17 11:13:56 crc kubenswrapper[4742]: E0317 11:13:56.453832 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.471671 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:56Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.490586 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:56Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.505861 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:56Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.535549 4742 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed0101f2b36840ce3821eb83c8aabb050bfbb5fc1ac73e1e34a58fe74202a6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ae5c289c3da9700dc4768431ed4fa91e0dc602cebfd45c5e304b0acde0c591\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:13:54Z\\\",\\\"message\\\":\\\" removal\\\\nI0317 11:13:54.838728 6767 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 11:13:54.838737 6767 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 11:13:54.838747 6767 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0317 11:13:54.838759 6767 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0317 11:13:54.838763 6767 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 11:13:54.838718 6767 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:13:54.838780 6767 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0317 11:13:54.838794 6767 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:13:54.838798 6767 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 11:13:54.838818 6767 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 11:13:54.838823 6767 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0317 11:13:54.838816 6767 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 11:13:54.838854 6767 factory.go:656] Stopping watch factory\\\\nI0317 11:13:54.838871 6767 ovnkube.go:599] Stopped ovnkube\\\\nI0317 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed0101f2b36840ce3821eb83c8aabb050bfbb5fc1ac73e1e34a58fe74202a6d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:13:56Z\\\",\\\"message\\\":\\\"3:56.394297 6904 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0317 
11:13:56.394303 6904 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:13:56.394325 6904 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0317 11:13:56.394332 6904 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 11:13:56.394344 6904 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0317 11:13:56.394409 6904 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:13:56.394499 6904 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 11:13:56.394538 6904 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0317 11:13:56.394584 6904 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 11:13:56.394608 6904 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 11:13:56.394648 6904 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 11:13:56.394630 6904 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0317 11:13:56.394695 6904 factory.go:656] Stopping watch factory\\\\nI0317 11:13:56.394714 6904 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0317 11:13:56.394717 6904 ovnkube.go:599] Stopped ovnkube\\\\nI0317 11:13:56.394700 6904 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0317 11:13:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:56Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.561544 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:56Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.575147 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:56Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.587211 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:56Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.596611 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:56Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.614468 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:56Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.629028 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:56Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.644073 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:56Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.661059 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:56Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.680653 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T11:13:56Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.702713 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec
9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:56Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.715318 4742 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:56Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:56 crc kubenswrapper[4742]: I0317 11:13:56.728677 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:56Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.458593 4742 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovnkube-controller/1.log" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.464730 4742 scope.go:117] "RemoveContainer" containerID="2ed0101f2b36840ce3821eb83c8aabb050bfbb5fc1ac73e1e34a58fe74202a6d" Mar 17 11:13:57 crc kubenswrapper[4742]: E0317 11:13:57.464948 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.477348 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:57Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.493478 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:57Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.507232 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:57Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.540173 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed0101f2b36840ce3821eb83c8aabb050bfbb5f
c1ac73e1e34a58fe74202a6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed0101f2b36840ce3821eb83c8aabb050bfbb5fc1ac73e1e34a58fe74202a6d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:13:56Z\\\",\\\"message\\\":\\\"3:56.394297 6904 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0317 11:13:56.394303 6904 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:13:56.394325 6904 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0317 11:13:56.394332 6904 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 11:13:56.394344 6904 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0317 11:13:56.394409 6904 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:13:56.394499 6904 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 11:13:56.394538 6904 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0317 11:13:56.394584 6904 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 11:13:56.394608 6904 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 11:13:56.394648 6904 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 11:13:56.394630 6904 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0317 11:13:56.394695 6904 factory.go:656] Stopping watch factory\\\\nI0317 11:13:56.394714 6904 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0317 11:13:56.394717 6904 ovnkube.go:599] Stopped ovnkube\\\\nI0317 11:13:56.394700 6904 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0317 11:13:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:57Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.576771 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332a
a3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:57Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.597112 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:57Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.619511 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:57Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.638401 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:57Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.661975 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.662091 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.662103 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.662105 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:57 crc kubenswrapper[4742]: E0317 11:13:57.662261 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:13:57 crc kubenswrapper[4742]: E0317 11:13:57.662400 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:57 crc kubenswrapper[4742]: E0317 11:13:57.662523 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:57 crc kubenswrapper[4742]: E0317 11:13:57.662610 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.662980 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:57Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.682472 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:57Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.703966 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:57Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.723587 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:57Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.743504 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T11:13:57Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.762838 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec
9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:57Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.781212 4742 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:57Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:57 crc kubenswrapper[4742]: I0317 11:13:57.799287 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:57Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:58 crc kubenswrapper[4742]: I0317 11:13:58.423957 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs\") pod \"network-metrics-daemon-drnx8\" (UID: \"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\") " pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:58 crc kubenswrapper[4742]: E0317 11:13:58.424138 4742 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 11:13:58 crc kubenswrapper[4742]: E0317 11:13:58.424210 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs podName:6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14 nodeName:}" failed. No retries permitted until 2026-03-17 11:14:30.424190074 +0000 UTC m=+173.550317842 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs") pod "network-metrics-daemon-drnx8" (UID: "6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 11:13:58 crc kubenswrapper[4742]: I0317 11:13:58.683744 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:58 crc kubenswrapper[4742]: I0317 11:13:58.698323 4742 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:58 crc kubenswrapper[4742]: I0317 11:13:58.709678 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:58 crc kubenswrapper[4742]: I0317 11:13:58.732584 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed0101f2b36840ce3821eb83c8aabb050bfbb5f
c1ac73e1e34a58fe74202a6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed0101f2b36840ce3821eb83c8aabb050bfbb5fc1ac73e1e34a58fe74202a6d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:13:56Z\\\",\\\"message\\\":\\\"3:56.394297 6904 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0317 11:13:56.394303 6904 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:13:56.394325 6904 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0317 11:13:56.394332 6904 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 11:13:56.394344 6904 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0317 11:13:56.394409 6904 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:13:56.394499 6904 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 11:13:56.394538 6904 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0317 11:13:56.394584 6904 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 11:13:56.394608 6904 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 11:13:56.394648 6904 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 11:13:56.394630 6904 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0317 11:13:56.394695 6904 factory.go:656] Stopping watch factory\\\\nI0317 11:13:56.394714 6904 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0317 11:13:56.394717 6904 ovnkube.go:599] Stopped ovnkube\\\\nI0317 11:13:56.394700 6904 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0317 11:13:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:58 crc kubenswrapper[4742]: I0317 11:13:58.752644 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332a
a3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:58 crc kubenswrapper[4742]: I0317 11:13:58.765215 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:58 crc kubenswrapper[4742]: I0317 11:13:58.776602 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:58 crc kubenswrapper[4742]: I0317 11:13:58.787339 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:58 crc kubenswrapper[4742]: I0317 11:13:58.802443 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f3
6dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:58 crc kubenswrapper[4742]: I0317 11:13:58.813434 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:58 crc kubenswrapper[4742]: I0317 11:13:58.825187 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:58 crc kubenswrapper[4742]: I0317 11:13:58.838275 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:58 crc kubenswrapper[4742]: I0317 11:13:58.851265 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T11:13:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:58 crc kubenswrapper[4742]: I0317 11:13:58.874702 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec
9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:58 crc kubenswrapper[4742]: E0317 11:13:58.884535 4742 
kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:13:58 crc kubenswrapper[4742]: I0317 11:13:58.888864 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:58 crc kubenswrapper[4742]: I0317 11:13:58.900638 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:13:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:13:59 crc kubenswrapper[4742]: I0317 11:13:59.662678 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:13:59 crc kubenswrapper[4742]: I0317 11:13:59.662729 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:13:59 crc kubenswrapper[4742]: I0317 11:13:59.662805 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:13:59 crc kubenswrapper[4742]: I0317 11:13:59.662872 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:13:59 crc kubenswrapper[4742]: E0317 11:13:59.663186 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:13:59 crc kubenswrapper[4742]: E0317 11:13:59.663309 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:13:59 crc kubenswrapper[4742]: E0317 11:13:59.663486 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:13:59 crc kubenswrapper[4742]: E0317 11:13:59.663647 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.074172 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.074217 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.074227 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.074242 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.074252 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:00Z","lastTransitionTime":"2026-03-17T11:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:00 crc kubenswrapper[4742]: E0317 11:14:00.090449 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:00Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.094555 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.094613 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.094626 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.094653 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.094673 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:00Z","lastTransitionTime":"2026-03-17T11:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:00 crc kubenswrapper[4742]: E0317 11:14:00.112953 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:00Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.118581 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.118632 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.118646 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.118667 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.118676 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:00Z","lastTransitionTime":"2026-03-17T11:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:00 crc kubenswrapper[4742]: E0317 11:14:00.137764 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:00Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.144069 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.144151 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.144174 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.144203 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.144227 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:00Z","lastTransitionTime":"2026-03-17T11:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:00 crc kubenswrapper[4742]: E0317 11:14:00.161856 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:00Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.166284 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.166358 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.166377 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.166406 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:00 crc kubenswrapper[4742]: I0317 11:14:00.166428 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:00Z","lastTransitionTime":"2026-03-17T11:14:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:00 crc kubenswrapper[4742]: E0317 11:14:00.184026 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:00Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:00 crc kubenswrapper[4742]: E0317 11:14:00.184141 4742 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 11:14:01 crc kubenswrapper[4742]: I0317 11:14:01.662111 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:01 crc kubenswrapper[4742]: I0317 11:14:01.662165 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:01 crc kubenswrapper[4742]: I0317 11:14:01.662160 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:01 crc kubenswrapper[4742]: I0317 11:14:01.662262 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:01 crc kubenswrapper[4742]: E0317 11:14:01.662386 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:01 crc kubenswrapper[4742]: E0317 11:14:01.662456 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:01 crc kubenswrapper[4742]: E0317 11:14:01.662586 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:01 crc kubenswrapper[4742]: E0317 11:14:01.662893 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:01 crc kubenswrapper[4742]: I0317 11:14:01.675643 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 17 11:14:03 crc kubenswrapper[4742]: I0317 11:14:03.662864 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:03 crc kubenswrapper[4742]: I0317 11:14:03.662892 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:03 crc kubenswrapper[4742]: I0317 11:14:03.662982 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:03 crc kubenswrapper[4742]: E0317 11:14:03.663071 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:03 crc kubenswrapper[4742]: I0317 11:14:03.663096 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:03 crc kubenswrapper[4742]: E0317 11:14:03.663278 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:03 crc kubenswrapper[4742]: E0317 11:14:03.663315 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:03 crc kubenswrapper[4742]: E0317 11:14:03.663629 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:03 crc kubenswrapper[4742]: I0317 11:14:03.677721 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 17 11:14:03 crc kubenswrapper[4742]: E0317 11:14:03.885809 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:14:05 crc kubenswrapper[4742]: I0317 11:14:05.661956 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:05 crc kubenswrapper[4742]: I0317 11:14:05.661977 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:05 crc kubenswrapper[4742]: I0317 11:14:05.661966 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:05 crc kubenswrapper[4742]: I0317 11:14:05.661956 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:05 crc kubenswrapper[4742]: E0317 11:14:05.662104 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:05 crc kubenswrapper[4742]: E0317 11:14:05.662263 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:05 crc kubenswrapper[4742]: E0317 11:14:05.662361 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:05 crc kubenswrapper[4742]: E0317 11:14:05.662399 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:07 crc kubenswrapper[4742]: I0317 11:14:07.662400 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:07 crc kubenswrapper[4742]: I0317 11:14:07.662454 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:07 crc kubenswrapper[4742]: I0317 11:14:07.662404 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:07 crc kubenswrapper[4742]: I0317 11:14:07.662404 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:07 crc kubenswrapper[4742]: E0317 11:14:07.662660 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:07 crc kubenswrapper[4742]: E0317 11:14:07.662741 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:07 crc kubenswrapper[4742]: E0317 11:14:07.662834 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:07 crc kubenswrapper[4742]: E0317 11:14:07.663025 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.686172 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c
47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.702961 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.717835 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.732209 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.746465 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5929c1f-8c88-4de7-bdf8-697bcc72db2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a132551d101e2b563c4c67711d9016aa93f490c249da6528d1c0699559bda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a72eb81a971abc3f010dee5c6b08f3e4489f2b2a736565a539686a8c595f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1979945fade0ed959d214aacf4dca66954ed81718bbcebea222648ec5d32d5d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.762258 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.784521 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.798823 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 
11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.810808 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.826362 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.839880 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.854436 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:08 crc kubenswrapper[4742]: E0317 11:14:08.886417 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.887803 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ed0101f2b36840ce3821eb83c8aabb050bfbb5fc1ac73e1e34a58fe74202a6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed0101f2b36840ce3821eb83c8aabb050bfbb5fc1ac73e1e34a58fe74202a6d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:13:56Z\\\",\\\"message\\\":\\\"3:56.394297 6904 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0317 11:13:56.394303 6904 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:13:56.394325 6904 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0317 11:13:56.394332 6904 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 11:13:56.394344 6904 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0317 11:13:56.394409 6904 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:13:56.394499 6904 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 11:13:56.394538 6904 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0317 11:13:56.394584 6904 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 11:13:56.394608 6904 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 11:13:56.394648 6904 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 11:13:56.394630 6904 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0317 11:13:56.394695 6904 factory.go:656] Stopping watch factory\\\\nI0317 11:13:56.394714 6904 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0317 11:13:56.394717 6904 ovnkube.go:599] Stopped ovnkube\\\\nI0317 11:13:56.394700 6904 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0317 
11:13:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.907253 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19e16e08-f79a-4053-ae9b-1712b1502658\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce55c9fe552db57aed7315321391c7967cf58577562e2bc07bf2299a9c984277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ 
-n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 11:12:06.794873 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0317 11:12:06.796424 1 observer_polling.go:159] Starting file observer\\\\nI0317 11:12:06.799354 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 11:12:06.800598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0317 11:12:36.358933 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0317 11:12:36.359068 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c24d97c9524fad5a195f249e664ea02183bdf272a5cf4c18ca8ca92847249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95f37054e36beb567082e022834ff266550a43e6a912dc8a13ff56c92ff83dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs
\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9086f75851d2392fa76a578b475d57eef4270c45babea46075a09f0dbef154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.928497 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/et
cd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.941354 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.954974 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:08 crc kubenswrapper[4742]: I0317 11:14:08.965851 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:08Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:09 crc kubenswrapper[4742]: I0317 11:14:09.662291 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:09 crc kubenswrapper[4742]: I0317 11:14:09.662438 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:09 crc kubenswrapper[4742]: I0317 11:14:09.662553 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:09 crc kubenswrapper[4742]: E0317 11:14:09.662571 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:09 crc kubenswrapper[4742]: I0317 11:14:09.662685 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:09 crc kubenswrapper[4742]: E0317 11:14:09.663382 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:09 crc kubenswrapper[4742]: I0317 11:14:09.663495 4742 scope.go:117] "RemoveContainer" containerID="2ed0101f2b36840ce3821eb83c8aabb050bfbb5fc1ac73e1e34a58fe74202a6d" Mar 17 11:14:09 crc kubenswrapper[4742]: E0317 11:14:09.663073 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:09 crc kubenswrapper[4742]: E0317 11:14:09.663553 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.464629 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.465088 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.465099 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.465116 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.465127 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:10Z","lastTransitionTime":"2026-03-17T11:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:10 crc kubenswrapper[4742]: E0317 11:14:10.479896 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.484184 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.484231 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.484243 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.484262 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.484274 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:10Z","lastTransitionTime":"2026-03-17T11:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:10 crc kubenswrapper[4742]: E0317 11:14:10.500533 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.504780 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.504825 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.504842 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.504864 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.504875 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:10Z","lastTransitionTime":"2026-03-17T11:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.511703 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovnkube-controller/1.log" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.515311 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerStarted","Data":"c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9"} Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.516129 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:14:10 crc kubenswrapper[4742]: E0317 11:14:10.531689 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ... identical image list to the preceding entry ... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.541159 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.541233 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.541258 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.541296 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.541322 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:10Z","lastTransitionTime":"2026-03-17T11:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.563222 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19e16e08-f79a-4053-ae9b-1712b1502658\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce55c9fe552db57aed7315321391c7967cf58577562e2bc07bf2299a9c984277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 11:12:06.794873 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0317 11:12:06.796424 1 observer_polling.go:159] Starting file observer\\\\nI0317 11:12:06.799354 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 11:12:06.800598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0317 11:12:36.358933 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0317 11:12:36.359068 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c24d97c9524fad5a195f249e664ea02183bdf272a5cf4c18ca8ca92847249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95f37054e36beb567082e022834ff266550a43e6a912dc8a13ff56c92ff83dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9086f75851d2392fa76a578b475d57eef4270c45babea46075a09f0dbef154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: E0317 11:14:10.566565 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ... identical image list to the preceding entries ... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.572276 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.572329 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.572342 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.572367 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.572379 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:10Z","lastTransitionTime":"2026-03-17T11:14:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.594136 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: E0317 11:14:10.596210 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: E0317 11:14:10.596322 4742 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.613371 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.625700 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.635283 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.647442 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.657464 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.669093 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.682157 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.693210 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5929c1f-8c88-4de7-bdf8-697bcc72db2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a132551d101e2b563c4c67711d9016aa93f490c249da6528d1c0699559bda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a72eb81a971abc3f010dee5c6b08f3e4489f2b2a736565a539686a8c595f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1979945fade0ed959d214aacf4dca66954ed81718bbcebea222648ec5d32d5d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.706582 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.722464 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.733961 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 
11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.744958 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.758409 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.771897 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.782397 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:10 crc kubenswrapper[4742]: I0317 11:14:10.805286 4742 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed0101f2b36840ce3821eb83c8aabb050bfbb5fc1ac73e1e34a58fe74202a6d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:13:56Z\\\",\\\"message\\\":\\\"3:56.394297 6904 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0317 11:13:56.394303 6904 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:13:56.394325 6904 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0317 11:13:56.394332 6904 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 11:13:56.394344 6904 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0317 11:13:56.394409 6904 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:13:56.394499 6904 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 11:13:56.394538 6904 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0317 11:13:56.394584 6904 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 11:13:56.394608 6904 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 11:13:56.394648 6904 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 11:13:56.394630 6904 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0317 11:13:56.394695 6904 factory.go:656] Stopping watch factory\\\\nI0317 11:13:56.394714 6904 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0317 11:13:56.394717 6904 ovnkube.go:599] Stopped ovnkube\\\\nI0317 11:13:56.394700 6904 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0317 
11:13:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:10Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.527146 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovnkube-controller/2.log" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.528021 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovnkube-controller/1.log" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.531248 4742 generic.go:334] "Generic (PLEG): container finished" podID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerID="c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9" exitCode=1 Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.531317 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerDied","Data":"c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9"} Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.531406 4742 scope.go:117] "RemoveContainer" containerID="2ed0101f2b36840ce3821eb83c8aabb050bfbb5fc1ac73e1e34a58fe74202a6d" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.532537 4742 scope.go:117] "RemoveContainer" containerID="c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9" Mar 17 11:14:11 crc kubenswrapper[4742]: E0317 11:14:11.532808 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.569206 4742 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:
11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.583693 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.601276 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.615886 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.634601 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19e16e08-f79a-4053-ae9b-1712b1502658\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce55c9fe552db57aed7315321391c7967cf58577562e2bc07bf2299a9c984277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 11:12:06.794873 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0317 11:12:06.796424 1 observer_polling.go:159] Starting file observer\\\\nI0317 11:12:06.799354 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 11:12:06.800598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0317 11:12:36.358933 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0317 11:12:36.359068 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c24d97c9524fad5a195f249e664ea02183bdf272a5cf4c18ca8ca92847249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95f37054e36beb567082e022834ff266550a43e6a912dc8a13ff56c92ff83dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9086f75851d2392fa76a578b475d57eef4270c45babea46075a09f0dbef154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.656393 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.661831 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.661933 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.661979 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 17 11:14:11 crc kubenswrapper[4742]: E0317 11:14:11.661995 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14"
Mar 17 11:14:11 crc kubenswrapper[4742]: E0317 11:14:11.662253 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.662281 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 17 11:14:11 crc kubenswrapper[4742]: E0317 11:14:11.662393 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 17 11:14:11 crc kubenswrapper[4742]: E0317 11:14:11.662473 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
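
The "No sandbox for pod can be found" and "network is not ready" entries above are the downstream effect of the ovnkube-controller crash loop recorded earlier in this section: until OVN-Kubernetes writes a CNI configuration into /etc/kubernetes/cni/net.d/, the kubelet reports NetworkReady=false and cannot create sandboxes for the multus metrics daemon, the networking console plugin, or the network-check pods, so they stay in ContainerCreating. Below is a minimal sketch, not kubelet code, of the readiness test the NetworkPluginNotReady message implies: check whether the CNI conf directory named in the error contains any network configuration yet. The directory comes from the log line; the accepted file extensions are an assumption for illustration.

    // cnicheck.go: a sketch of the check implied by "no CNI configuration
    // file in /etc/kubernetes/cni/net.d/". The directory is taken from the
    // kubelet error text; the glob patterns are illustrative assumptions.
    package main

    import (
    	"fmt"
    	"os"
    	"path/filepath"
    )

    func main() {
    	confDir := "/etc/kubernetes/cni/net.d"
    	var found []string
    	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
    		matches, err := filepath.Glob(filepath.Join(confDir, pat))
    		if err != nil {
    			fmt.Fprintln(os.Stderr, "bad pattern:", err)
    			os.Exit(1)
    		}
    		found = append(found, matches...)
    	}
    	if len(found) == 0 {
    		fmt.Println("network not ready: no CNI configuration file in", confDir)
    		os.Exit(1)
    	}
    	fmt.Println("CNI configuration present:", found)
    }

Once ovnkube-controller stays up long enough to write its configuration, a check like this passes and the kubelet can begin creating the pending sandboxes.
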
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.670667 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.689465 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.709631 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.728075 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.755969 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec
9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.777755 4742 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.794058 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.816866 4742 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5929c1f-8c88-4de7-bdf8-697bcc72db2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a132551d101e2b563c4c67711d9016aa93f490c249da6528d1c0699559bda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a72eb81a971abc3f010dee5c6b08f3e4489f2b2a736565a539686a8c595f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1979945fade0ed959d214aacf4dca66954ed81718bbcebea222648ec5d32d5d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.838360 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.858090 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.886737 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ed0101f2b36840ce3821eb83c8aabb050bfbb5fc1ac73e1e34a58fe74202a6d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:13:56Z\\\",\\\"message\\\":\\\"3:56.394297 6904 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0317 11:13:56.394303 6904 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:13:56.394325 6904 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0317 11:13:56.394332 6904 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 11:13:56.394344 6904 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0317 11:13:56.394409 6904 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:13:56.394499 6904 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 11:13:56.394538 6904 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0317 11:13:56.394584 6904 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 11:13:56.394608 6904 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 11:13:56.394648 6904 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 11:13:56.394630 6904 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0317 11:13:56.394695 6904 factory.go:656] Stopping watch factory\\\\nI0317 11:13:56.394714 6904 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0317 11:13:56.394717 6904 ovnkube.go:599] Stopped ovnkube\\\\nI0317 11:13:56.394700 6904 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0317 11:13:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"*v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658384 7101 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:14:10.658631 7101 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658668 7101 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:14:10.658680 7101 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:14:10.658688 7101 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0317 11:14:10.658693 7101 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0317 11:14:10.658740 7101 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658776 7101 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 11:14:10.658818 7101 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:14:10.659011 7101 factory.go:656] Stopping watch factory\\\\nI0317 11:14:10.659029 7101 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:14:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"ima
geID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:11 crc kubenswrapper[4742]: I0317 11:14:11.909175 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:11Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.536217 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovnkube-controller/2.log" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.539939 4742 scope.go:117] "RemoveContainer" containerID="c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9" Mar 17 11:14:12 crc kubenswrapper[4742]: E0317 11:14:12.540127 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.558053 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.576693 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.587344 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.598522 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.609737 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.625869 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5929c1f-8c88-4de7-bdf8-697bcc72db2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a132551d101e2b563c4c67711d9016aa93f490c249da6528d1c0699559bda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a72eb81a971abc3f010dee5c6b08f3e4489f2b2a736565a539686a8c595f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1979945fade0ed959d214aacf4dca66954ed81718bbcebea222648ec5d32d5d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.640158 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.660508 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\
\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.678736 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.690998 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.703321 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.718592 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.751437 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c312f608f8b39b1a72e5959c3b4d07ec4041aaa8
71ccff4573145408facdc5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"*v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658384 7101 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:14:10.658631 7101 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658668 7101 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:14:10.658680 7101 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:14:10.658688 7101 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0317 11:14:10.658693 7101 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0317 11:14:10.658740 7101 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658776 7101 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 11:14:10.658818 7101 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:14:10.659011 7101 factory.go:656] Stopping watch factory\\\\nI0317 11:14:10.659029 7101 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:14:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.760648 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.778981 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19e16e08-f79a-4053-ae9b-1712b1502658\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce55c9fe552db57aed7315321391c7967cf58577562e2bc07bf2299a9c984277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 11:12:06.794873 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0317 11:12:06.796424 1 observer_polling.go:159] Starting file observer\\\\nI0317 11:12:06.799354 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 11:12:06.800598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0317 11:12:36.358933 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0317 11:12:36.359068 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c24d97c9524fad5a195f249e664ea02183bdf272a5cf4c18ca8ca92847249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95f37054e36beb567082e022834ff266550a43e6a912dc8a13ff56c92ff83dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9086f75851d2392fa76a578b475d57eef4270c45babea46075a09f0dbef154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.800881 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-0
3-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.815995 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:12 crc kubenswrapper[4742]: I0317 11:14:12.833720 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:12Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:13 crc kubenswrapper[4742]: I0317 11:14:13.662672 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:13 crc kubenswrapper[4742]: I0317 11:14:13.662726 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:13 crc kubenswrapper[4742]: I0317 11:14:13.662824 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:13 crc kubenswrapper[4742]: I0317 11:14:13.663000 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:13 crc kubenswrapper[4742]: E0317 11:14:13.663075 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:13 crc kubenswrapper[4742]: E0317 11:14:13.663186 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:13 crc kubenswrapper[4742]: E0317 11:14:13.663422 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:13 crc kubenswrapper[4742]: E0317 11:14:13.663509 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:13 crc kubenswrapper[4742]: I0317 11:14:13.679261 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 17 11:14:13 crc kubenswrapper[4742]: E0317 11:14:13.887729 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:14:15 crc kubenswrapper[4742]: I0317 11:14:15.661900 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:15 crc kubenswrapper[4742]: I0317 11:14:15.662035 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:15 crc kubenswrapper[4742]: E0317 11:14:15.662131 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:15 crc kubenswrapper[4742]: I0317 11:14:15.661900 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:15 crc kubenswrapper[4742]: E0317 11:14:15.662257 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:15 crc kubenswrapper[4742]: E0317 11:14:15.662360 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:15 crc kubenswrapper[4742]: I0317 11:14:15.662377 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:15 crc kubenswrapper[4742]: E0317 11:14:15.662507 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:17 crc kubenswrapper[4742]: I0317 11:14:17.662883 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:17 crc kubenswrapper[4742]: I0317 11:14:17.663121 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:17 crc kubenswrapper[4742]: I0317 11:14:17.662902 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:17 crc kubenswrapper[4742]: E0317 11:14:17.663326 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:17 crc kubenswrapper[4742]: E0317 11:14:17.663121 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:17 crc kubenswrapper[4742]: E0317 11:14:17.663425 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:17 crc kubenswrapper[4742]: I0317 11:14:17.664444 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:17 crc kubenswrapper[4742]: E0317 11:14:17.664619 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:18 crc kubenswrapper[4742]: I0317 11:14:18.684588 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5929c1f-8c88-4de7-bdf8-697bcc72db2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a132551d101e2b563c4c67711d9016aa93f490c249da6528d1c0699559bda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a72eb81a971abc3f010dee5c6b08f3e4489f2b2a736565a539686a8c595f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1979945fade0ed959d214aacf4dca66954ed81718bbcebea222648ec5d32d5d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:18Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:18 crc kubenswrapper[4742]: I0317 11:14:18.705676 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:18Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:18 crc kubenswrapper[4742]: I0317 11:14:18.727380 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:18Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:18 crc kubenswrapper[4742]: I0317 11:14:18.744822 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:18Z is after 2025-08-24T17:21:41Z" Mar 17 
11:14:18 crc kubenswrapper[4742]: I0317 11:14:18.760189 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:18Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:18 crc kubenswrapper[4742]: I0317 11:14:18.775449 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:18Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:18 crc kubenswrapper[4742]: I0317 11:14:18.805046 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:18Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:18 crc kubenswrapper[4742]: I0317 11:14:18.822655 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:18Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:18 crc kubenswrapper[4742]: I0317 11:14:18.853227 4742 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"*v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658384 7101 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:14:10.658631 7101 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658668 7101 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:14:10.658680 7101 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:14:10.658688 7101 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0317 11:14:10.658693 7101 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0317 11:14:10.658740 7101 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658776 7101 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 11:14:10.658818 7101 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:14:10.659011 7101 factory.go:656] Stopping watch factory\\\\nI0317 11:14:10.659029 7101 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:14:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:18Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:18 crc kubenswrapper[4742]: I0317 11:14:18.871122 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19e16e08-f79a-4053-ae9b-1712b1502658\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce55c9fe552db57aed7315321391c7967cf58577562e2bc07bf2299a9c984277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 11:12:06.794873 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0317 11:12:06.796424 1 observer_polling.go:159] Starting file observer\\\\nI0317 11:12:06.799354 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 11:12:06.800598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0317 11:12:36.358933 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0317 11:12:36.359068 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c24d97c9524fad5a195f249e664ea02183bdf272a5cf4c18ca8ca92847249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95f37054e36beb567082e022834ff266550a43e6a912dc8a13ff56c92ff83dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9086f75851d2392fa76a578b475d57eef4270c45babea46075a09f0dbef154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d
17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:18Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:18 crc kubenswrapper[4742]: E0317 11:14:18.889324 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:14:18 crc kubenswrapper[4742]: I0317 11:14:18.906153 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84
d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:18Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:18 crc kubenswrapper[4742]: I0317 11:14:18.925815 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:18Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:18 crc kubenswrapper[4742]: I0317 11:14:18.943384 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:18Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:18 crc kubenswrapper[4742]: I0317 11:14:18.958437 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:18Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:18 crc kubenswrapper[4742]: I0317 11:14:18.970782 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b0b044-72f0-4bbf-80b2-c8a1178ad0ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef34e2c73260f5fc46fc0a526e4c1e5bd59861295b227901413b64b6d27a8a74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:18Z is after 
2025-08-24T17:21:41Z" Mar 17 11:14:18 crc kubenswrapper[4742]: I0317 11:14:18.990818 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e
0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:18Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:19 crc kubenswrapper[4742]: I0317 11:14:19.001380 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:18Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:19 crc kubenswrapper[4742]: 
I0317 11:14:19.015298 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-17T11:14:19Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:19 crc kubenswrapper[4742]: I0317 11:14:19.031404 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:19Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:19 crc kubenswrapper[4742]: I0317 11:14:19.662130 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:19 crc kubenswrapper[4742]: I0317 11:14:19.662222 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:19 crc kubenswrapper[4742]: I0317 11:14:19.662222 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:19 crc kubenswrapper[4742]: E0317 11:14:19.662357 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:19 crc kubenswrapper[4742]: E0317 11:14:19.662504 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:19 crc kubenswrapper[4742]: E0317 11:14:19.662626 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:19 crc kubenswrapper[4742]: I0317 11:14:19.663174 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:19 crc kubenswrapper[4742]: E0317 11:14:19.663508 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:20 crc kubenswrapper[4742]: I0317 11:14:20.968045 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:20 crc kubenswrapper[4742]: I0317 11:14:20.968118 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:14:20 crc kubenswrapper[4742]: I0317 11:14:20.968136 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:20 crc kubenswrapper[4742]: I0317 11:14:20.968163 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:20 crc kubenswrapper[4742]: I0317 11:14:20.968180 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:20Z","lastTransitionTime":"2026-03-17T11:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:20 crc kubenswrapper[4742]: E0317 11:14:20.987167 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:20Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:20 crc kubenswrapper[4742]: I0317 11:14:20.992669 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:20 crc kubenswrapper[4742]: I0317 11:14:20.992732 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:20 crc kubenswrapper[4742]: I0317 11:14:20.992749 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:20 crc kubenswrapper[4742]: I0317 11:14:20.992775 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:20 crc kubenswrapper[4742]: I0317 11:14:20.992792 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:20Z","lastTransitionTime":"2026-03-17T11:14:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.010633 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:21Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.015985 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.016247 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.016405 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.016603 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.016825 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:21Z","lastTransitionTime":"2026-03-17T11:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.036406 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:21Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.040712 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.040815 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.040828 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.040873 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.040887 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:21Z","lastTransitionTime":"2026-03-17T11:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.056970 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:21Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.062229 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.062487 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.062659 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.062816 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.062999 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:21Z","lastTransitionTime":"2026-03-17T11:14:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.083951 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:21Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.084228 4742 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.607973 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.608157 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.608186 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:25.608149694 +0000 UTC m=+228.734277482 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.608231 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.608288 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.608330 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.608347 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.608352 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.608419 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 
11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.608436 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.608455 4742 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.608511 4742 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.608586 4742 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.608514 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 11:15:25.608497514 +0000 UTC m=+228.734625302 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.608639 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:15:25.608625498 +0000 UTC m=+228.734753286 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.608658 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:15:25.608648218 +0000 UTC m=+228.734776006 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.608421 4742 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.608703 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 11:15:25.60869197 +0000 UTC m=+228.734819758 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.662228 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.662274 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.662327 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:21 crc kubenswrapper[4742]: I0317 11:14:21.662252 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.662464 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.662721 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.663186 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:21 crc kubenswrapper[4742]: E0317 11:14:21.663369 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:23 crc kubenswrapper[4742]: I0317 11:14:23.662766 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:23 crc kubenswrapper[4742]: I0317 11:14:23.662816 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:23 crc kubenswrapper[4742]: I0317 11:14:23.662840 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:23 crc kubenswrapper[4742]: E0317 11:14:23.664626 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:23 crc kubenswrapper[4742]: I0317 11:14:23.662878 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:23 crc kubenswrapper[4742]: E0317 11:14:23.664729 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:23 crc kubenswrapper[4742]: E0317 11:14:23.664881 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:23 crc kubenswrapper[4742]: E0317 11:14:23.664895 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:23 crc kubenswrapper[4742]: E0317 11:14:23.891163 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:14:25 crc kubenswrapper[4742]: I0317 11:14:25.661886 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:25 crc kubenswrapper[4742]: I0317 11:14:25.661992 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:25 crc kubenswrapper[4742]: I0317 11:14:25.662023 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:25 crc kubenswrapper[4742]: E0317 11:14:25.662109 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:25 crc kubenswrapper[4742]: E0317 11:14:25.662241 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:25 crc kubenswrapper[4742]: I0317 11:14:25.662347 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:25 crc kubenswrapper[4742]: E0317 11:14:25.662774 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:25 crc kubenswrapper[4742]: I0317 11:14:25.663253 4742 scope.go:117] "RemoveContainer" containerID="c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9" Mar 17 11:14:25 crc kubenswrapper[4742]: E0317 11:14:25.663278 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:25 crc kubenswrapper[4742]: E0317 11:14:25.663509 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" Mar 17 11:14:27 crc kubenswrapper[4742]: I0317 11:14:27.662613 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:27 crc kubenswrapper[4742]: E0317 11:14:27.662891 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:27 crc kubenswrapper[4742]: I0317 11:14:27.663370 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:27 crc kubenswrapper[4742]: I0317 11:14:27.663456 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:27 crc kubenswrapper[4742]: E0317 11:14:27.663523 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:27 crc kubenswrapper[4742]: I0317 11:14:27.663456 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:27 crc kubenswrapper[4742]: E0317 11:14:27.663645 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:27 crc kubenswrapper[4742]: E0317 11:14:27.663735 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:28 crc kubenswrapper[4742]: I0317 11:14:28.687579 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:28Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:28 crc kubenswrapper[4742]: I0317 11:14:28.709632 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:28Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:28 crc kubenswrapper[4742]: I0317 11:14:28.731102 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:28Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:28 crc kubenswrapper[4742]: I0317 11:14:28.764266 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c312f608f8b39b1a72e5959c3b4d07ec4041aaa8
71ccff4573145408facdc5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"*v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658384 7101 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:14:10.658631 7101 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658668 7101 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:14:10.658680 7101 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:14:10.658688 7101 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0317 11:14:10.658693 7101 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0317 11:14:10.658740 7101 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658776 7101 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 11:14:10.658818 7101 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:14:10.659011 7101 factory.go:656] Stopping watch factory\\\\nI0317 11:14:10.659029 7101 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:14:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:28Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:28 crc kubenswrapper[4742]: I0317 11:14:28.781569 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:28Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:28 crc kubenswrapper[4742]: I0317 11:14:28.801970 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19e16e08-f79a-4053-ae9b-1712b1502658\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce55c9fe552db57aed7315321391c7967cf58577562e2bc07bf2299a9c984277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 11:12:06.794873 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0317 11:12:06.796424 1 observer_polling.go:159] Starting file observer\\\\nI0317 11:12:06.799354 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 11:12:06.800598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0317 11:12:36.358933 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0317 11:12:36.359068 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c24d97c9524fad5a195f249e664ea02183bdf272a5cf4c18ca8ca92847249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95f37054e36beb567082e022834ff266550a43e6a912dc8a13ff56c92ff83dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9086f75851d2392fa76a578b475d57eef4270c45babea46075a09f0dbef154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:28Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:28 crc kubenswrapper[4742]: I0317 11:14:28.841553 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-0
3-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:28Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:28 crc kubenswrapper[4742]: I0317 11:14:28.863722 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:28Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:28 crc kubenswrapper[4742]: I0317 11:14:28.886239 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:28Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:28 crc kubenswrapper[4742]: E0317 11:14:28.892312 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:14:28 crc kubenswrapper[4742]: I0317 11:14:28.909975 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.
io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:28Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:28 crc kubenswrapper[4742]: I0317 11:14:28.929292 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b0b044-72f0-4bbf-80b2-c8a1178ad0ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef34e2c73260f5fc46fc0a526e4c1e5bd59861295b227901413b64b6d27a8a74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:28Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:28 crc kubenswrapper[4742]: I0317 11:14:28.955337 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:28Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:28 crc kubenswrapper[4742]: I0317 11:14:28.973856 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:28Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:28 crc kubenswrapper[4742]: I0317 11:14:28.994706 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:28Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:29 crc kubenswrapper[4742]: I0317 11:14:29.013610 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:29Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:29 crc kubenswrapper[4742]: I0317 11:14:29.033140 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5929c1f-8c88-4de7-bdf8-697bcc72db2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a132551d101e2b563c4c67711d9016aa93f490c249da6528d1c0699559bda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a72eb81a971abc3f010dee5c6b08f3e4489f2b2a736565a539686a8c595f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1979945fade0ed959d214aacf4dca66954ed81718bbcebea222648ec5d32d5d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:29Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:29 crc kubenswrapper[4742]: I0317 11:14:29.055814 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:29Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:29 crc kubenswrapper[4742]: I0317 11:14:29.078316 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\
\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:29Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:29 crc kubenswrapper[4742]: I0317 11:14:29.096683 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:29Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:29 crc kubenswrapper[4742]: I0317 11:14:29.662813 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:29 crc kubenswrapper[4742]: E0317 11:14:29.663027 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:29 crc kubenswrapper[4742]: I0317 11:14:29.662813 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:29 crc kubenswrapper[4742]: I0317 11:14:29.662843 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:29 crc kubenswrapper[4742]: I0317 11:14:29.662836 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:29 crc kubenswrapper[4742]: E0317 11:14:29.663348 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:29 crc kubenswrapper[4742]: E0317 11:14:29.663375 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:29 crc kubenswrapper[4742]: E0317 11:14:29.663426 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:30 crc kubenswrapper[4742]: I0317 11:14:30.517725 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs\") pod \"network-metrics-daemon-drnx8\" (UID: \"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\") " pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:30 crc kubenswrapper[4742]: E0317 11:14:30.518025 4742 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 11:14:30 crc kubenswrapper[4742]: E0317 11:14:30.518154 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs podName:6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14 nodeName:}" failed. No retries permitted until 2026-03-17 11:15:34.518124997 +0000 UTC m=+237.644252755 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs") pod "network-metrics-daemon-drnx8" (UID: "6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.112510 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.112567 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.112586 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.112609 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.112631 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:31Z","lastTransitionTime":"2026-03-17T11:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:14:31 crc kubenswrapper[4742]: E0317 11:14:31.133890 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:31Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.138786 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.138849 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
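
The rejected payloads in these entries are valid JSON strategic-merge patches ($setElementOrder/conditions is the directive that preserves condition ordering), but klog quotes them twice: once for the err="..." field and once for the embedded patch string. A small sketch of recovering readable JSON with two rounds of Go unquoting; the fragment is shortened here, since the real payloads above run to several kilobytes:

    package main

    import (
    	"encoding/json"
    	"fmt"
    	"log"
    	"strconv"
    )

    func main() {
    	// Shortened fragment, copied as it appears inside err="..." above.
    	raw := `\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"}}\"`

    	// Pass 1 undoes klog's quoting of the err field; pass 2 undoes
    	// the quoting of the embedded patch string.
    	pass1, err := strconv.Unquote(`"` + raw + `"`)
    	if err != nil {
    		log.Fatalf("pass 1: %v", err)
    	}
    	patchJSON, err := strconv.Unquote(pass1)
    	if err != nil {
    		log.Fatalf("pass 2: %v", err)
    	}

    	var patch map[string]any
    	if err := json.Unmarshal([]byte(patchJSON), &patch); err != nil {
    		log.Fatalf("unmarshal: %v", err)
    	}
    	pretty, _ := json.MarshalIndent(patch, "", "  ")
    	fmt.Println(string(pretty))
    }

Run against a full line, the same two passes make any of the pod or node patches above diff-able.
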
event="NodeHasNoDiskPressure" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.138874 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.138946 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.138973 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:31Z","lastTransitionTime":"2026-03-17T11:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:31 crc kubenswrapper[4742]: E0317 11:14:31.155859 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:31Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.161072 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.161130 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
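
Behind every KubeletNotReady retry in this stretch is the same missing piece: no CNI network configuration under /etc/kubernetes/cni/net.d/. The kubelet's real readiness check goes through the container runtime and libcni; the sketch below only mirrors the idea, with the directory path taken from the log and the extensions (.conf, .conflist, .json are what libcni loads) as the stated assumptions:

    package main

    import (
    	"fmt"
    	"os"
    	"path/filepath"
    )

    func main() {
    	// Directory quoted in the NetworkReady=false messages above.
    	dir := "/etc/kubernetes/cni/net.d"

    	// Collect any CNI network configs present in the directory.
    	var found []string
    	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
    		m, _ := filepath.Glob(filepath.Join(dir, pat))
    		found = append(found, m...)
    	}

    	if len(found) == 0 {
    		fmt.Println("NetworkReady=false: no CNI configuration file in", dir)
    		os.Exit(1)
    	}
    	fmt.Println("CNI configuration present:", found)
    }

Until the network provider (OVN-Kubernetes here) writes a config into that directory, the sandbox-less pods above (network-check-source, network-check-target, network-metrics-daemon, networking-console-plugin) stay stuck in ContainerCreating.
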
event="NodeHasNoDiskPressure" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.161154 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.161183 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.161204 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:31Z","lastTransitionTime":"2026-03-17T11:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:31 crc kubenswrapper[4742]: E0317 11:14:31.179776 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:31Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.184823 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.184870 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.184886 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.184944 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.184963 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:31Z","lastTransitionTime":"2026-03-17T11:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:31 crc kubenswrapper[4742]: E0317 11:14:31.207208 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:31Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.212330 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.212377 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.212395 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.212417 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.212433 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:31Z","lastTransitionTime":"2026-03-17T11:14:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:31 crc kubenswrapper[4742]: E0317 11:14:31.236127 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:31Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:31 crc kubenswrapper[4742]: E0317 11:14:31.236454 4742 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.662649 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.662710 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.662680 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:31 crc kubenswrapper[4742]: E0317 11:14:31.662889 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:31 crc kubenswrapper[4742]: I0317 11:14:31.662960 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:31 crc kubenswrapper[4742]: E0317 11:14:31.663117 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:31 crc kubenswrapper[4742]: E0317 11:14:31.663277 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:31 crc kubenswrapper[4742]: E0317 11:14:31.663400 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.621164 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwmfc_ff1068ee-5ebe-4575-806d-967a3b9bfb6a/kube-multus/0.log" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.621238 4742 generic.go:334] "Generic (PLEG): container finished" podID="ff1068ee-5ebe-4575-806d-967a3b9bfb6a" containerID="e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622" exitCode=1 Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.621309 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwmfc" event={"ID":"ff1068ee-5ebe-4575-806d-967a3b9bfb6a","Type":"ContainerDied","Data":"e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622"} Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.621991 4742 scope.go:117] "RemoveContainer" containerID="e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.647943 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.662659 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.662711 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.662712 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:33 crc kubenswrapper[4742]: E0317 11:14:33.662865 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.663000 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:33 crc kubenswrapper[4742]: E0317 11:14:33.663047 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:33 crc kubenswrapper[4742]: E0317 11:14:33.663186 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:33 crc kubenswrapper[4742]: E0317 11:14:33.663332 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.665246 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.685891 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.704874 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:33Z\\\",\\\"message\\\":\\\"2026-03-17T11:13:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3cd2678-ac86-4d9a-90ef-23d7358e40cd\\\\n2026-03-17T11:13:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3cd2678-ac86-4d9a-90ef-23d7358e40cd to /host/opt/cni/bin/\\\\n2026-03-17T11:13:48Z [verbose] multus-daemon started\\\\n2026-03-17T11:13:48Z [verbose] Readiness Indicator file check\\\\n2026-03-17T11:14:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.721645 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b0b044-72f0-4bbf-80b2-c8a1178ad0ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef34e2c73260f5fc46fc0a526e4c1e5bd59861295b227901413b64b6d27a8a74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.741454 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.765260 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.785388 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 
11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.803111 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.822614 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5929c1f-8c88-4de7-bdf8-697bcc72db2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a132551d101e2b563c4c67711d9016aa93f490c249da6528d1c0699559bda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a72eb81a971abc3f010dee5c6b08f3e4489f2b2a736565a539686a8c595f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1979945fade0ed959d214aacf4dca66954ed81718bbcebea222648ec5d32d5d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.844132 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.866341 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:33 crc kubenswrapper[4742]: E0317 11:14:33.893861 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.900514 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"*v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658384 7101 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:14:10.658631 7101 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658668 7101 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:14:10.658680 7101 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:14:10.658688 7101 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0317 11:14:10.658693 7101 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0317 11:14:10.658740 7101 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658776 7101 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 11:14:10.658818 7101 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:14:10.659011 7101 factory.go:656] Stopping watch factory\\\\nI0317 11:14:10.659029 7101 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:14:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s 
restarting failed container=ovnkube-controller pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.922164 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.951704 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.968558 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.982856 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:33 crc kubenswrapper[4742]: I0317 11:14:33.995884 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:33Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.008998 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19e16e08-f79a-4053-ae9b-1712b1502658\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce55c9fe552db57aed7315321391c7967cf58577562e2bc07bf2299a9c984277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 11:12:06.794873 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0317 11:12:06.796424 1 observer_polling.go:159] Starting file observer\\\\nI0317 11:12:06.799354 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 11:12:06.800598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0317 11:12:36.358933 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0317 11:12:36.359068 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c24d97c9524fad5a195f249e664ea02183bdf272a5cf4c18ca8ca92847249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95f37054e36beb567082e022834ff266550a43e6a912dc8a13ff56c92ff83dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9086f75851d2392fa76a578b475d57eef4270c45babea46075a09f0dbef154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.627426 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwmfc_ff1068ee-5ebe-4575-806d-967a3b9bfb6a/kube-multus/0.log" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.627543 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwmfc" event={"ID":"ff1068ee-5ebe-4575-806d-967a3b9bfb6a","Type":"ContainerStarted","Data":"1a7dfbf3da964f99f958fe0751c5fdfaf6d1c1d5938316d5fa840c4187b524fe"} Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.646015 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.670274 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5929c1f-8c88-4de7-bdf8-697bcc72db2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a132551d101e2b563c4c67711d9016aa93f490c249da6528d1c0699559bda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a72eb81a971abc3f010dee5c6b08f3e4489f2b2a736565a539686a8c595f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1979945fade0ed959d214aacf4dca66954ed81718bbcebea222648ec5d32d5d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.690310 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.711720 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\
\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.726675 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.744010 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.761869 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.779849 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.799578 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c312f608f8b39b1a72e5959c3b4d07ec4041aaa8
71ccff4573145408facdc5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"*v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658384 7101 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:14:10.658631 7101 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658668 7101 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:14:10.658680 7101 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:14:10.658688 7101 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0317 11:14:10.658693 7101 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0317 11:14:10.658740 7101 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658776 7101 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 11:14:10.658818 7101 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:14:10.659011 7101 factory.go:656] Stopping watch factory\\\\nI0317 11:14:10.659029 7101 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:14:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.813117 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.826737 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19e16e08-f79a-4053-ae9b-1712b1502658\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce55c9fe552db57aed7315321391c7967cf58577562e2bc07bf2299a9c984277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 11:12:06.794873 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0317 11:12:06.796424 1 observer_polling.go:159] Starting file observer\\\\nI0317 11:12:06.799354 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 11:12:06.800598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0317 11:12:36.358933 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0317 11:12:36.359068 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c24d97c9524fad5a195f249e664ea02183bdf272a5cf4c18ca8ca92847249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95f37054e36beb567082e022834ff266550a43e6a912dc8a13ff56c92ff83dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9086f75851d2392fa76a578b475d57eef4270c45babea46075a09f0dbef154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.858760 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-0
3-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.880847 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.896133 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.914890 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a7dfbf3da964f99f958fe0751c5fdfaf6d1c1d5938316d5fa840c4187b524fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:33Z\\\",\\\"message\\\":\\\"2026-03-17T11:13:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3cd2678-ac86-4d9a-90ef-23d7358e40cd\\\\n2026-03-17T11:13:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3cd2678-ac86-4d9a-90ef-23d7358e40cd to /host/opt/cni/bin/\\\\n2026-03-17T11:13:48Z [verbose] multus-daemon started\\\\n2026-03-17T11:13:48Z [verbose] Readiness Indicator file check\\\\n2026-03-17T11:14:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.929964 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b0b044-72f0-4bbf-80b2-c8a1178ad0ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef34e2c73260f5fc46fc0a526e4c1e5bd59861295b227901413b64b6d27a8a74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.950209 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.962626 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:34 crc kubenswrapper[4742]: I0317 11:14:34.977102 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:34Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:35 crc kubenswrapper[4742]: I0317 11:14:35.662647 4742 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 17 11:14:35 crc kubenswrapper[4742]: I0317 11:14:35.662759 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8"
Mar 17 11:14:35 crc kubenswrapper[4742]: I0317 11:14:35.662697 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 17 11:14:35 crc kubenswrapper[4742]: E0317 11:14:35.662877 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 17 11:14:35 crc kubenswrapper[4742]: I0317 11:14:35.663026 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 17 11:14:35 crc kubenswrapper[4742]: E0317 11:14:35.663125 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14"
Mar 17 11:14:35 crc kubenswrapper[4742]: E0317 11:14:35.663274 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 17 11:14:35 crc kubenswrapper[4742]: E0317 11:14:35.663399 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 17 11:14:37 crc kubenswrapper[4742]: I0317 11:14:37.662403 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 17 11:14:37 crc kubenswrapper[4742]: E0317 11:14:37.662593 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 17 11:14:37 crc kubenswrapper[4742]: I0317 11:14:37.662677 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8"
Mar 17 11:14:37 crc kubenswrapper[4742]: E0317 11:14:37.662778 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14"
Mar 17 11:14:37 crc kubenswrapper[4742]: I0317 11:14:37.662828 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 17 11:14:37 crc kubenswrapper[4742]: E0317 11:14:37.662943 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 17 11:14:37 crc kubenswrapper[4742]: I0317 11:14:37.662999 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 17 11:14:37 crc kubenswrapper[4742]: E0317 11:14:37.663078 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.663294 4742 scope.go:117] "RemoveContainer" containerID="c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9" Mar 17 11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.683732 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19e16e08-f79a-4053-ae9b-1712b1502658\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce55c9fe552db57aed7315321391c7967cf58577562e2bc07bf2299a9c984277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 11:12:06.794873 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0317 11:12:06.796424 1 observer_polling.go:159] Starting file observer\\\\nI0317 11:12:06.799354 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 11:12:06.800598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0317 11:12:36.358933 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0317 11:12:36.359068 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c24d97c9524fad5a195f249e664ea02183bdf272a5cf4c18ca8ca92847249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95f37054e36beb567082e022834ff266550a43e6a912dc8a13ff56c92ff83dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9086f75851d2392fa76a578b475d57eef4270c45babea46075a09f0dbef154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:38Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.717337 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-0
3-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:38Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.738721 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:38Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.758834 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:38Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.772285 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:38Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.783112 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b0b044-72f0-4bbf-80b2-c8a1178ad0ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef34e2c73260f5fc46fc0a526e4c1e5bd59861295b227901413b64b6d27a8a74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:38Z is after 
2025-08-24T17:21:41Z" Mar 17 11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.802498 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e
0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:38Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.815601 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:38Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:38 crc kubenswrapper[4742]: 
I0317 11:14:38.834990 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-17T11:14:38Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.865260 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a7dfbf3da964f99f958fe0751c5fdfaf6d1c1d5938316d5fa840c4187b524fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:33Z\\\",\\\"message\\\":\\\"2026-03-17T11:13:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3cd2678-ac86-4d9a-90ef-23d7358e40cd\\\\n2026-03-17T11:13:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3cd2678-ac86-4d9a-90ef-23d7358e40cd to /host/opt/cni/bin/\\\\n2026-03-17T11:13:48Z [verbose] multus-daemon started\\\\n2026-03-17T11:13:48Z [verbose] Readiness Indicator file check\\\\n2026-03-17T11:14:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:38Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.883015 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5929c1f-8c88-4de7-bdf8-697bcc72db2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a132551d101e2b563c4c67711d9016aa93f490c249da6528d1c0699559bda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a72eb81a971abc3f010dee5c6b08f3e4489f2b2a736565a539686a8c595f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1979945fade0ed959d214aacf4dca66954ed81718bbcebea222648ec5d32d5d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:38Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:38 crc kubenswrapper[4742]: E0317 11:14:38.894756 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.897347 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:38Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.918430 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:38Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.931847 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:38Z is after 2025-08-24T17:21:41Z" Mar 17 
11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.942376 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:38Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.961648 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:38Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.974916 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:38Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:38 crc kubenswrapper[4742]: I0317 11:14:38.990647 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:38Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.014518 4742 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"*v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658384 7101 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:14:10.658631 7101 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658668 7101 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:14:10.658680 7101 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:14:10.658688 7101 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0317 11:14:10.658693 7101 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0317 11:14:10.658740 7101 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658776 7101 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 11:14:10.658818 7101 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:14:10.659011 7101 factory.go:656] Stopping watch factory\\\\nI0317 11:14:10.659029 7101 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:14:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.647832 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovnkube-controller/3.log" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.648511 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovnkube-controller/2.log" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.651432 4742 generic.go:334] "Generic (PLEG): container finished" podID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerID="80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2" exitCode=1 Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.651485 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerDied","Data":"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2"} Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.651532 4742 scope.go:117] "RemoveContainer" containerID="c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.652467 4742 scope.go:117] "RemoveContainer" containerID="80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2" Mar 17 11:14:39 crc kubenswrapper[4742]: E0317 11:14:39.652650 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.661871 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.661937 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:39 crc kubenswrapper[4742]: E0317 11:14:39.661999 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.662069 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:39 crc kubenswrapper[4742]: E0317 11:14:39.662071 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:39 crc kubenswrapper[4742]: E0317 11:14:39.662128 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.662158 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:39 crc kubenswrapper[4742]: E0317 11:14:39.662216 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.664542 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b0b044-72f0-4bbf-80b2-c8a1178ad0ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef34e2c73260f5fc46fc0a526e4c1e5bd59861295b227901413b64b6d27a8a74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 
17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.684363 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70
a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.694168 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.704887 4742 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.718069 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a7dfbf3da964f99f958fe0751c5fdfaf6d1c1d5938316d5fa840c4187b524fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:33Z\\\",\\\"message\\\":\\\"2026-03-17T11:13:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3cd2678-ac86-4d9a-90ef-23d7358e40cd\\\\n2026-03-17T11:13:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3cd2678-ac86-4d9a-90ef-23d7358e40cd to /host/opt/cni/bin/\\\\n2026-03-17T11:13:48Z [verbose] multus-daemon started\\\\n2026-03-17T11:13:48Z [verbose] Readiness Indicator file check\\\\n2026-03-17T11:14:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.728643 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5929c1f-8c88-4de7-bdf8-697bcc72db2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a132551d101e2b563c4c67711d9016aa93f490c249da6528d1c0699559bda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a72eb81a971abc3f010dee5c6b08f3e4489f2b2a736565a539686a8c595f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1979945fade0ed959d214aacf4dca66954ed81718bbcebea222648ec5d32d5d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.748595 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.768765 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\
\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.780340 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.791375 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.803748 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.817089 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.827560 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.844339 4742 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c312f608f8b39b1a72e5959c3b4d07ec4041aaa871ccff4573145408facdc5a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:10Z\\\",\\\"message\\\":\\\"*v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658384 7101 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:14:10.658631 7101 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658668 7101 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:14:10.658680 7101 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0317 11:14:10.658688 7101 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0317 11:14:10.658693 7101 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0317 11:14:10.658740 7101 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:10.658776 7101 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 11:14:10.658818 7101 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:14:10.659011 7101 factory.go:656] Stopping watch factory\\\\nI0317 11:14:10.659029 7101 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:14:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:39Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:39.563866 7408 handler.go:190] 
Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:14:39.563898 7408 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0317 11:14:39.563920 7408 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 11:14:39.563932 7408 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0317 11:14:39.563936 7408 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0317 11:14:39.563972 7408 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0317 11:14:39.563973 7408 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 11:14:39.563988 7408 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 11:14:39.563994 7408 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 11:14:39.563995 7408 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 11:14:39.563999 7408 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 11:14:39.564010 7408 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0317 11:14:39.564017 7408 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:14:39.564046 7408 factory.go:656] Stopping watch factory\\\\nI0317 11:14:39.564065 7408 ovnkube.go:599] Stopped ovnkube\\\\nI0317 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:14:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.862704 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19e16e08-f79a-4053-ae9b-1712b1502658\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce55c9fe552db57aed7315321391c7967cf58577562e2bc07bf2299a9c984277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 11:12:06.794873 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0317 11:12:06.796424 1 observer_polling.go:159] Starting file observer\\\\nI0317 11:12:06.799354 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 11:12:06.800598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0317 11:12:36.358933 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0317 11:12:36.359068 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c24d97c9524fad5a195f249e664ea02183bdf272a5cf4c18ca8ca92847249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95f37054e36beb567082e022834ff266550a43e6a912dc8a13ff56c92ff83dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9086f75851d2392fa76a578b475d57eef4270c45babea46075a09f0dbef154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.890817 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-0
3-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.907585 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.925039 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:39 crc kubenswrapper[4742]: I0317 11:14:39.939127 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:39Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:40 crc kubenswrapper[4742]: I0317 11:14:40.658232 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovnkube-controller/3.log" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.282645 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.282742 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.282766 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.282809 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.282839 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:41Z","lastTransitionTime":"2026-03-17T11:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:41 crc kubenswrapper[4742]: E0317 11:14:41.306694 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:41Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.312790 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.312846 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.312869 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.312899 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.312971 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:41Z","lastTransitionTime":"2026-03-17T11:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:41 crc kubenswrapper[4742]: E0317 11:14:41.334043 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:41Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.339759 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.339975 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.340009 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.340086 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.340107 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:41Z","lastTransitionTime":"2026-03-17T11:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:41 crc kubenswrapper[4742]: E0317 11:14:41.361360 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:41Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.367399 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.367459 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.367477 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.367503 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.367521 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:41Z","lastTransitionTime":"2026-03-17T11:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:41 crc kubenswrapper[4742]: E0317 11:14:41.388113 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:41Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.393049 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.393087 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.393099 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.393120 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.393133 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:41Z","lastTransitionTime":"2026-03-17T11:14:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:41 crc kubenswrapper[4742]: E0317 11:14:41.410246 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:41Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:41 crc kubenswrapper[4742]: E0317 11:14:41.410400 4742 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.663061 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.663130 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.663127 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:41 crc kubenswrapper[4742]: I0317 11:14:41.663090 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:41 crc kubenswrapper[4742]: E0317 11:14:41.663298 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:41 crc kubenswrapper[4742]: E0317 11:14:41.663607 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:41 crc kubenswrapper[4742]: E0317 11:14:41.663710 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:41 crc kubenswrapper[4742]: E0317 11:14:41.663822 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:43 crc kubenswrapper[4742]: I0317 11:14:43.662391 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:43 crc kubenswrapper[4742]: I0317 11:14:43.662457 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:43 crc kubenswrapper[4742]: I0317 11:14:43.662520 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:43 crc kubenswrapper[4742]: I0317 11:14:43.662532 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:43 crc kubenswrapper[4742]: E0317 11:14:43.662622 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:43 crc kubenswrapper[4742]: E0317 11:14:43.662843 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:43 crc kubenswrapper[4742]: E0317 11:14:43.663115 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:43 crc kubenswrapper[4742]: E0317 11:14:43.663302 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:43 crc kubenswrapper[4742]: E0317 11:14:43.896798 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:14:45 crc kubenswrapper[4742]: I0317 11:14:45.662271 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:45 crc kubenswrapper[4742]: I0317 11:14:45.662368 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:45 crc kubenswrapper[4742]: I0317 11:14:45.662378 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:45 crc kubenswrapper[4742]: E0317 11:14:45.662501 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:45 crc kubenswrapper[4742]: E0317 11:14:45.662642 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:45 crc kubenswrapper[4742]: E0317 11:14:45.662843 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:45 crc kubenswrapper[4742]: I0317 11:14:45.662923 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:45 crc kubenswrapper[4742]: E0317 11:14:45.663055 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:47 crc kubenswrapper[4742]: I0317 11:14:47.662890 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:47 crc kubenswrapper[4742]: I0317 11:14:47.663010 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:47 crc kubenswrapper[4742]: I0317 11:14:47.662928 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:47 crc kubenswrapper[4742]: E0317 11:14:47.663111 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:47 crc kubenswrapper[4742]: I0317 11:14:47.663010 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:47 crc kubenswrapper[4742]: E0317 11:14:47.663252 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:47 crc kubenswrapper[4742]: E0317 11:14:47.663404 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:47 crc kubenswrapper[4742]: E0317 11:14:47.663689 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.065328 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.066691 4742 scope.go:117] "RemoveContainer" containerID="80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2" Mar 17 11:14:48 crc kubenswrapper[4742]: E0317 11:14:48.067098 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.091895 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.116794 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.134974 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 
11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.148246 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.163446 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5929c1f-8c88-4de7-bdf8-697bcc72db2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a132551d101e2b563c4c67711d9016aa93f490c249da6528d1c0699559bda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a72eb81a971abc3f010dee5c6b08f3e4489f2b2a736565a539686a8c595f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1979945fade0ed959d214aacf4dca66954ed81718bbcebea222648ec5d32d5d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.177577 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.196329 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.229889 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:39Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:39.563866 7408 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:14:39.563898 7408 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0317 11:14:39.563920 7408 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 11:14:39.563932 7408 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0317 11:14:39.563936 7408 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0317 11:14:39.563972 7408 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0317 11:14:39.563973 7408 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 11:14:39.563988 7408 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 11:14:39.563994 7408 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 11:14:39.563995 7408 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 11:14:39.563999 7408 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 11:14:39.564010 7408 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0317 11:14:39.564017 7408 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:14:39.564046 7408 factory.go:656] Stopping watch factory\\\\nI0317 11:14:39.564065 7408 ovnkube.go:599] Stopped ovnkube\\\\nI0317 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:14:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.251562 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.283453 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.304142 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.321402 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.337137 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.352769 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19e16e08-f79a-4053-ae9b-1712b1502658\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce55c9fe552db57aed7315321391c7967cf58577562e2bc07bf2299a9c984277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 11:12:06.794873 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0317 11:12:06.796424 1 observer_polling.go:159] Starting file observer\\\\nI0317 11:12:06.799354 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 11:12:06.800598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0317 11:12:36.358933 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0317 11:12:36.359068 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c24d97c9524fad5a195f249e664ea02183bdf272a5cf4c18ca8ca92847249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95f37054e36beb567082e022834ff266550a43e6a912dc8a13ff56c92ff83dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9086f75851d2392fa76a578b475d57eef4270c45babea46075a09f0dbef154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.370675 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.381842 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.394118 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.413767 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a7dfbf3da964f99f958fe0751c5fdfaf6d1c1d5938316d5fa840c4187b524fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:33Z\\\",\\\"message\\\":\\\"2026-03-17T11:13:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3cd2678-ac86-4d9a-90ef-23d7358e40cd\\\\n2026-03-17T11:13:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3cd2678-ac86-4d9a-90ef-23d7358e40cd to /host/opt/cni/bin/\\\\n2026-03-17T11:13:48Z [verbose] multus-daemon started\\\\n2026-03-17T11:13:48Z [verbose] Readiness Indicator file check\\\\n2026-03-17T11:14:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.426952 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b0b044-72f0-4bbf-80b2-c8a1178ad0ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef34e2c73260f5fc46fc0a526e4c1e5bd59861295b227901413b64b6d27a8a74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.684115 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.708894 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.728624 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 
11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.746867 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.766710 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5929c1f-8c88-4de7-bdf8-697bcc72db2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a132551d101e2b563c4c67711d9016aa93f490c249da6528d1c0699559bda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a72eb81a971abc3f010dee5c6b08f3e4489f2b2a736565a539686a8c595f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1979945fade0ed959d214aacf4dca66954ed81718bbcebea222648ec5d32d5d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.787014 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.805548 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.836330 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:39Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:39.563866 7408 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:14:39.563898 7408 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0317 11:14:39.563920 7408 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 11:14:39.563932 7408 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0317 11:14:39.563936 7408 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0317 11:14:39.563972 7408 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0317 11:14:39.563973 7408 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 11:14:39.563988 7408 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 11:14:39.563994 7408 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 11:14:39.563995 7408 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 11:14:39.563999 7408 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 11:14:39.564010 7408 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0317 11:14:39.564017 7408 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:14:39.564046 7408 factory.go:656] Stopping watch factory\\\\nI0317 11:14:39.564065 7408 ovnkube.go:599] Stopped ovnkube\\\\nI0317 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:14:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.858269 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.895039 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: E0317 11:14:48.897421 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.926779 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.946807 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.963886 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:48 crc kubenswrapper[4742]: I0317 11:14:48.992025 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19e16e08-f79a-4053-ae9b-1712b1502658\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce55c9fe552db57aed7315321391c7967cf58577562e2bc07bf2299a9c984277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 11:12:06.794873 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0317 11:12:06.796424 1 observer_polling.go:159] Starting file observer\\\\nI0317 11:12:06.799354 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 11:12:06.800598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0317 11:12:36.358933 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0317 11:12:36.359068 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c24d97c9524fad5a195f249e664ea02183bdf272a5cf4c18ca8ca92847249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95f37054e36beb567082e022834ff266550a43e6a912dc8a13ff56c92ff83dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9086f75851d2392fa76a578b475d57eef4270c45babea46075a09f0dbef154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:48Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:49 crc kubenswrapper[4742]: I0317 11:14:49.013462 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:49Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:49 crc kubenswrapper[4742]: I0317 11:14:49.027104 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:49Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:49 crc kubenswrapper[4742]: I0317 11:14:49.047427 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:49Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:49 crc kubenswrapper[4742]: I0317 11:14:49.072349 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a7dfbf3da964f99f958fe0751c5fdfaf6d1c1d5938316d5fa840c4187b524fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:33Z\\\",\\\"message\\\":\\\"2026-03-17T11:13:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3cd2678-ac86-4d9a-90ef-23d7358e40cd\\\\n2026-03-17T11:13:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3cd2678-ac86-4d9a-90ef-23d7358e40cd to /host/opt/cni/bin/\\\\n2026-03-17T11:13:48Z [verbose] multus-daemon started\\\\n2026-03-17T11:13:48Z [verbose] Readiness Indicator file check\\\\n2026-03-17T11:14:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:49Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:49 crc kubenswrapper[4742]: I0317 11:14:49.089965 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b0b044-72f0-4bbf-80b2-c8a1178ad0ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef34e2c73260f5fc46fc0a526e4c1e5bd59861295b227901413b64b6d27a8a74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:49Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:49 crc kubenswrapper[4742]: I0317 11:14:49.662410 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:49 crc kubenswrapper[4742]: I0317 11:14:49.662432 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:49 crc kubenswrapper[4742]: I0317 11:14:49.662544 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:49 crc kubenswrapper[4742]: I0317 11:14:49.663109 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:49 crc kubenswrapper[4742]: E0317 11:14:49.663343 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:49 crc kubenswrapper[4742]: E0317 11:14:49.663454 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:49 crc kubenswrapper[4742]: E0317 11:14:49.663538 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:49 crc kubenswrapper[4742]: E0317 11:14:49.663610 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.504151 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.504217 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.504238 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.504263 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.504282 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:51Z","lastTransitionTime":"2026-03-17T11:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 17 11:14:51 crc kubenswrapper[4742]: E0317 11:14:51.526307 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:51Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.531962 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.532034 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.532060 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.532092 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.532116 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:51Z","lastTransitionTime":"2026-03-17T11:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:51 crc kubenswrapper[4742]: E0317 11:14:51.553711 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:51Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.559013 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.559058 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.559072 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.559096 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.559114 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:51Z","lastTransitionTime":"2026-03-17T11:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:51 crc kubenswrapper[4742]: E0317 11:14:51.577811 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:51Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.582979 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.583044 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.583062 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.583085 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.583103 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:51Z","lastTransitionTime":"2026-03-17T11:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:51 crc kubenswrapper[4742]: E0317 11:14:51.603037 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:51Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.609410 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.609453 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.609470 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.609494 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.609512 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:14:51Z","lastTransitionTime":"2026-03-17T11:14:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:14:51 crc kubenswrapper[4742]: E0317 11:14:51.629237 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:51Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:51 crc kubenswrapper[4742]: E0317 11:14:51.629833 4742 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.662397 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.662402 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.662447 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:51 crc kubenswrapper[4742]: E0317 11:14:51.663062 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:51 crc kubenswrapper[4742]: I0317 11:14:51.662596 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:51 crc kubenswrapper[4742]: E0317 11:14:51.663221 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:51 crc kubenswrapper[4742]: E0317 11:14:51.663391 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:51 crc kubenswrapper[4742]: E0317 11:14:51.663750 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:53 crc kubenswrapper[4742]: I0317 11:14:53.662620 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:53 crc kubenswrapper[4742]: I0317 11:14:53.662671 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:53 crc kubenswrapper[4742]: E0317 11:14:53.662754 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:53 crc kubenswrapper[4742]: I0317 11:14:53.662971 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:53 crc kubenswrapper[4742]: I0317 11:14:53.662995 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:53 crc kubenswrapper[4742]: E0317 11:14:53.663040 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:53 crc kubenswrapper[4742]: E0317 11:14:53.663393 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:53 crc kubenswrapper[4742]: E0317 11:14:53.663898 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:53 crc kubenswrapper[4742]: E0317 11:14:53.898971 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:14:55 crc kubenswrapper[4742]: I0317 11:14:55.662427 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:55 crc kubenswrapper[4742]: I0317 11:14:55.662490 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:55 crc kubenswrapper[4742]: I0317 11:14:55.662491 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:55 crc kubenswrapper[4742]: I0317 11:14:55.662514 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:55 crc kubenswrapper[4742]: E0317 11:14:55.662633 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:55 crc kubenswrapper[4742]: E0317 11:14:55.662779 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:55 crc kubenswrapper[4742]: E0317 11:14:55.662868 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:55 crc kubenswrapper[4742]: E0317 11:14:55.662977 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:57 crc kubenswrapper[4742]: I0317 11:14:57.662381 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:57 crc kubenswrapper[4742]: I0317 11:14:57.662381 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:57 crc kubenswrapper[4742]: I0317 11:14:57.662580 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:57 crc kubenswrapper[4742]: I0317 11:14:57.662681 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:57 crc kubenswrapper[4742]: E0317 11:14:57.663607 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:57 crc kubenswrapper[4742]: E0317 11:14:57.663754 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:57 crc kubenswrapper[4742]: E0317 11:14:57.663837 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:57 crc kubenswrapper[4742]: E0317 11:14:57.663987 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:58 crc kubenswrapper[4742]: I0317 11:14:58.680581 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5929c1f-8c88-4de7-bdf8-697bcc72db2a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7a132551d101e2b563c4c67711d9016aa93f490c249da6528d1c0699559bda6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18a72eb81a971abc3f010dee5c6b08f3e4489f2b2a736565a539686a8c595f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-1
7T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1979945fade0ed959d214aacf4dca66954ed81718bbcebea222648ec5d32d5d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07a0e8cd3c5743163b7875202cb0cfdeffc993c190199f54fa9a66eea0d174c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:58 crc kubenswrapper[4742]: I0317 11:14:58.700954 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://561936c6e2dc0a7aae282f5cb3e1f6c102bdb3796821d1393a468cea52934635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3beefc5b6701872174f7bec64407b038ca4aedd40e16ff75d29166e9f79283ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:58 crc kubenswrapper[4742]: I0317 11:14:58.724450 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0932050-dced-4c05-b9d2-d8db1db0dceb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b20b3274e1f5a5f10a74041504e71c3b1437a111d204443993ac763a66f7602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd3bde2da9c1b9b04bfad18ced3abb647ec74bb21a9c94ce850d81a25afe5585\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9370beec9a1ca72c8eddf72af9ee5ae6abf9f058a0b1f399631da87218a67280\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2ad68eb35b19e4e3bd8efd9b920d62902114f5750d724cf77e06b0c8063dc06\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://900ec86aa51092987f9bf41fd91007fb60b51223f51a3191002889d825d984d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cbd1b752796a3bc6aae17e862f89176688f967ed9a8ff9756e16318742a0f4c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dbd38157b030f3623a20f0108c0bb5c3cbe25916132b74540b02935507a2398\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r2qj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-hcxv8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:58 crc kubenswrapper[4742]: I0317 11:14:58.738654 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8c53ad4-b584-48be-8055-a928c8a0178f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a27a378eb360f1494f244da18a3fd46b7cb2e5b6af7b49d5f8017c7824ff6646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4def351a4e1da693d9f941da5e0258be6bc8f09698398dd36dbe17d86cb187cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5kmch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qv2v9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:58Z is after 2025-08-24T17:21:41Z" Mar 17 
11:14:58 crc kubenswrapper[4742]: I0317 11:14:58.753431 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drnx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2rtzs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:26Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drnx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:58 crc kubenswrapper[4742]: I0317 11:14:58.768409 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2415fe15a45509078e29a751a53d737b0415f0a1f83d1ed59530d90bff066074\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:58 crc kubenswrapper[4742]: I0317 11:14:58.788636 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:58 crc kubenswrapper[4742]: I0317 11:14:58.806399 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fc8f4fdf6ffac5af6a48de953d31a8194aa356ec8b4b35db260b22c49bc9402\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:58 crc kubenswrapper[4742]: I0317 11:14:58.838124 4742 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:39Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0317 11:14:39.563866 7408 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0317 11:14:39.563898 7408 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0317 11:14:39.563920 7408 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0317 11:14:39.563932 7408 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0317 11:14:39.563936 7408 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0317 11:14:39.563972 7408 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0317 11:14:39.563973 7408 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0317 11:14:39.563988 7408 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0317 11:14:39.563994 7408 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0317 11:14:39.563995 7408 handler.go:208] Removed *v1.Node event handler 7\\\\nI0317 11:14:39.563999 7408 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0317 11:14:39.564010 7408 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0317 11:14:39.564017 7408 handler.go:208] Removed *v1.Node event handler 2\\\\nI0317 11:14:39.564046 7408 factory.go:656] Stopping watch factory\\\\nI0317 11:14:39.564065 7408 ovnkube.go:599] Stopped ovnkube\\\\nI0317 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:14:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:13:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkjp8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwfsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:58 crc kubenswrapper[4742]: I0317 11:14:58.857830 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19e16e08-f79a-4053-ae9b-1712b1502658\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce55c9fe552db57aed7315321391c7967cf58577562e2bc07bf2299a9c984277\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://110c9bba6b8b4cf72126da139a3aec9a347e394aa77f92cad234e88786a28223\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:36Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0317 11:12:06.794873 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0317 11:12:06.796424 1 observer_polling.go:159] Starting file observer\\\\nI0317 11:12:06.799354 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0317 11:12:06.800598 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0317 11:12:36.358933 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0317 11:12:36.359068 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:06Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:12:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30c24d97c9524fad5a195f249e664ea02183bdf272a5cf4c18ca8ca92847249\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95f37054e36beb567082e022834ff266550a43e6a912dc8a13ff56c92ff83dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9086f75851d2392fa76a578b475d57eef4270c45babea46075a09f0dbef154\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d
17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:58 crc kubenswrapper[4742]: I0317 11:14:58.888806 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eac54340-f51e-4218-8cd2-6764b883cd7d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:12:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1cde7b1fcdf65227693adca8b48ba3587ed5e93ead09c3c445556bdc153d7879\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9fbb48f681383d89393332aa3af7de44ba6b1a42b4964afface813ab37ced47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://008627eaa5e88659d266c2e721b05c77f0c14f9f3e16e58f53c6b3416801cf00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4adf29d09ae22f6fb19a042fb9442812304cb84d32033d5974213230cecf917d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02362365a6bb3b2d5b98854ed7fe50acac789e1388643b880f35f09102095f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eac3e1e0d0ef29b826d0f50fd908102865386893c5418b85635f6dec71df04c8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d136e480100a648794f349fad80789a04835f44682bc9eca4147cc85882eefe5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d708db163b575341cec1361dd9cba09b72194f7690646376a554da9fd938ea0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:58 crc kubenswrapper[4742]: E0317 11:14:58.900378 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
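The status-patch failures above are uniform: every patch is rejected because the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired 2025-08-24T17:21:41Z, while the node clock reads 2026-03-17. A minimal Go sketch, not part of the log, that reproduces the check: it dials the webhook endpoint named in the error text and prints the leaf certificate's validity window. The address is taken from the log; everything else is an assumption for illustration.

// certcheck.go — hedged diagnostic sketch, assuming the webhook endpoint
// from the log is reachable from the node.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	addr := "127.0.0.1:9743" // webhook endpoint from the error messages above

	// InsecureSkipVerify is deliberate: verification is exactly what fails
	// in the log, and the goal here is to inspect the expired certificate
	// rather than reject the handshake.
	conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial %s: %v", addr, err)
	}
	defer conn.Close()

	leaf := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("subject:   %s\n", leaf.Subject)
	fmt.Printf("notBefore: %s\n", leaf.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", leaf.NotAfter.Format(time.RFC3339))
	fmt.Printf("expired:   %v\n", time.Now().After(leaf.NotAfter))
}

Against the endpoint in this log the sketch would print notAfter 2025-08-24T17:21:41Z and expired true, matching the "x509: certificate has expired or is not yet valid" tail on each failed patch.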
Mar 17 11:14:58 crc kubenswrapper[4742]: I0317 11:14:58.925964 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:58 crc kubenswrapper[4742]: I0317 11:14:58.945146 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:58 crc kubenswrapper[4742]: I0317 11:14:58.961738 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-hv2p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad7af928-88e1-468c-9471-8e7902a4a6ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://078e6b99f370dcccb55371d3fdc0b73886aa8f8b3270d24fe8e785e91280863e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92vc8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:19Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-hv2p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:58 crc kubenswrapper[4742]: I0317 11:14:58.977402 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18b0b044-72f0-4bbf-80b2-c8a1178ad0ef\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef34e2c73260f5fc46fc0a526e4c1e5bd59861295b227901413b64b6d27a8a74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4406dc9f23a9f00e25e49376ca24f9349c352ed493edbecf85a1277c8237b3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:58 crc kubenswrapper[4742]: I0317 11:14:58.999701 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0631e65b-dd02-40a7-8d35-2e4c66b70cd0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:11:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-cr
c-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-17T11:12:31Z\\\",\\\"message\\\":\\\"W0317 11:12:30.176957 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0317 11:12:30.177574 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773745950 cert, and key in /tmp/serving-cert-1974336890/serving-signer.crt, /tmp/serving-cert-1974336890/serving-signer.key\\\\nI0317 11:12:30.673951 1 observer_polling.go:159] Starting file observer\\\\nW0317 11:12:30.684270 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0317 11:12:30.684442 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0317 11:12:30.685237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1974336890/tls.crt::/tmp/serving-cert-1974336890/tls.key\\\\\\\"\\\\nF0317 11:12:31.200337 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:12:30Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:11:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-17T11:11:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-17T11:11:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:11:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:58Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:59 crc kubenswrapper[4742]: I0317 11:14:59.013809 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-kwrj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa31fa5e-119d-4392-b5c6-8f4a488e64af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0ab07ac4f5638bc7e5d0c98674200bc8d4b81d66ac80589f90dd9312ef6d5c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w4nh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-kwrj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:59Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:59 crc kubenswrapper[4742]: I0317 11:14:59.027974 4742 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5e11ad39-38bb-4b70-9cac-ce078b37f882\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62cbec598f014cad58d9d62ea2e5fc89a19aa4569c4f35686d4d49b1084ecd1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:13:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vmpzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5jxxw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:59Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:59 crc kubenswrapper[4742]: I0317 11:14:59.047682 4742 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-xwmfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff1068ee-5ebe-4575-806d-967a3b9bfb6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:13:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-17T11:14:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a7dfbf3da964f99f958fe0751c5fdfaf6d1c1d5938316d5fa840c4187b524fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-17T11:14:33Z\\\",\\\"message\\\":\\\"2026-03-17T11:13:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3cd2678-ac86-4d9a-90ef-23d7358e40cd\\\\n2026-03-17T11:13:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3cd2678-ac86-4d9a-90ef-23d7358e40cd to /host/opt/cni/bin/\\\\n2026-03-17T11:13:48Z [verbose] multus-daemon started\\\\n2026-03-17T11:13:48Z [verbose] Readiness Indicator file check\\\\n2026-03-17T11:14:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-17T11:13:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-17T11:14:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4w98f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-17T11:13:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xwmfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:14:59Z is after 2025-08-24T17:21:41Z" Mar 17 11:14:59 crc kubenswrapper[4742]: I0317 11:14:59.662446 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:14:59 crc kubenswrapper[4742]: I0317 11:14:59.662570 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:14:59 crc kubenswrapper[4742]: I0317 11:14:59.662572 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:14:59 crc kubenswrapper[4742]: I0317 11:14:59.662468 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:14:59 crc kubenswrapper[4742]: E0317 11:14:59.662691 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:14:59 crc kubenswrapper[4742]: E0317 11:14:59.662808 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:14:59 crc kubenswrapper[4742]: E0317 11:14:59.663056 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:14:59 crc kubenswrapper[4742]: E0317 11:14:59.663352 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:14:59 crc kubenswrapper[4742]: I0317 11:14:59.664620 4742 scope.go:117] "RemoveContainer" containerID="80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2" Mar 17 11:14:59 crc kubenswrapper[4742]: E0317 11:14:59.664980 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.657984 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.658033 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.658043 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.658061 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.658073 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:15:01Z","lastTransitionTime":"2026-03-17T11:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.662153 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.662268 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.662268 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.662277 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:01 crc kubenswrapper[4742]: E0317 11:15:01.662369 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:01 crc kubenswrapper[4742]: E0317 11:15:01.662510 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:01 crc kubenswrapper[4742]: E0317 11:15:01.662538 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:01 crc kubenswrapper[4742]: E0317 11:15:01.662718 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:01 crc kubenswrapper[4742]: E0317 11:15:01.678678 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:15:01Z is after 2025-08-24T17:21:41Z" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.684676 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.684773 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.684809 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.684843 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.684864 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:15:01Z","lastTransitionTime":"2026-03-17T11:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:15:01 crc kubenswrapper[4742]: E0317 11:15:01.709403 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:15:01Z is after 2025-08-24T17:21:41Z" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.714964 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.715031 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.715051 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.715078 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.715096 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:15:01Z","lastTransitionTime":"2026-03-17T11:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:15:01 crc kubenswrapper[4742]: E0317 11:15:01.732690 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:15:01Z is after 2025-08-24T17:21:41Z" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.738602 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.738706 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.738763 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.738789 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.738844 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:15:01Z","lastTransitionTime":"2026-03-17T11:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:15:01 crc kubenswrapper[4742]: E0317 11:15:01.763532 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:15:01Z is after 2025-08-24T17:21:41Z" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.770086 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.770208 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.770226 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.770254 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:15:01 crc kubenswrapper[4742]: I0317 11:15:01.770271 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:15:01Z","lastTransitionTime":"2026-03-17T11:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:15:01 crc kubenswrapper[4742]: E0317 11:15:01.788822 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-17T11:15:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a949f061-2bf4-4376-98c3-0527ac24d2e9\\\",\\\"systemUUID\\\":\\\"6693cb74-dd53-4aae-b4e6-7786830660f7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-17T11:15:01Z is after 2025-08-24T17:21:41Z" Mar 17 11:15:01 crc kubenswrapper[4742]: E0317 11:15:01.789082 4742 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 11:15:03 crc kubenswrapper[4742]: I0317 11:15:03.662443 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:03 crc kubenswrapper[4742]: E0317 11:15:03.662582 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:03 crc kubenswrapper[4742]: I0317 11:15:03.662589 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:03 crc kubenswrapper[4742]: I0317 11:15:03.662646 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:03 crc kubenswrapper[4742]: I0317 11:15:03.662612 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:03 crc kubenswrapper[4742]: E0317 11:15:03.663463 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:03 crc kubenswrapper[4742]: E0317 11:15:03.663950 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:03 crc kubenswrapper[4742]: E0317 11:15:03.664189 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:03 crc kubenswrapper[4742]: E0317 11:15:03.902055 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:15:05 crc kubenswrapper[4742]: I0317 11:15:05.662297 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:05 crc kubenswrapper[4742]: I0317 11:15:05.662406 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:05 crc kubenswrapper[4742]: I0317 11:15:05.662546 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:05 crc kubenswrapper[4742]: E0317 11:15:05.662538 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:05 crc kubenswrapper[4742]: I0317 11:15:05.662596 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:05 crc kubenswrapper[4742]: E0317 11:15:05.662669 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:05 crc kubenswrapper[4742]: E0317 11:15:05.662791 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:05 crc kubenswrapper[4742]: E0317 11:15:05.663020 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:07 crc kubenswrapper[4742]: I0317 11:15:07.662884 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:07 crc kubenswrapper[4742]: I0317 11:15:07.662886 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:07 crc kubenswrapper[4742]: E0317 11:15:07.663167 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:07 crc kubenswrapper[4742]: I0317 11:15:07.663219 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:07 crc kubenswrapper[4742]: I0317 11:15:07.663233 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:07 crc kubenswrapper[4742]: E0317 11:15:07.663556 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:07 crc kubenswrapper[4742]: E0317 11:15:07.663828 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:07 crc kubenswrapper[4742]: E0317 11:15:07.664001 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:08 crc kubenswrapper[4742]: I0317 11:15:08.720704 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=67.720675827 podStartE2EDuration="1m7.720675827s" podCreationTimestamp="2026-03-17 11:14:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:08.697001403 +0000 UTC m=+211.823129221" watchObservedRunningTime="2026-03-17 11:15:08.720675827 +0000 UTC m=+211.846803625" Mar 17 11:15:08 crc kubenswrapper[4742]: I0317 11:15:08.744938 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hcxv8" podStartSLOduration=144.744878935 podStartE2EDuration="2m24.744878935s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:08.744866005 +0000 UTC m=+211.870993843" watchObservedRunningTime="2026-03-17 11:15:08.744878935 +0000 UTC m=+211.871006733" Mar 17 11:15:08 crc kubenswrapper[4742]: I0317 11:15:08.768756 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qv2v9" podStartSLOduration=144.768737184 podStartE2EDuration="2m24.768737184s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:08.768554729 +0000 UTC m=+211.894682527" watchObservedRunningTime="2026-03-17 11:15:08.768737184 +0000 UTC m=+211.894864952" Mar 17 11:15:08 crc kubenswrapper[4742]: E0317 11:15:08.902706 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:15:08 crc kubenswrapper[4742]: I0317 11:15:08.926348 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=65.926257503 podStartE2EDuration="1m5.926257503s" podCreationTimestamp="2026-03-17 11:14:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:08.926012145 +0000 UTC m=+212.052139943" watchObservedRunningTime="2026-03-17 11:15:08.926257503 +0000 UTC m=+212.052385301" Mar 17 11:15:09 crc kubenswrapper[4742]: I0317 11:15:09.005988 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=112.005972104 podStartE2EDuration="1m52.005972104s" podCreationTimestamp="2026-03-17 11:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:08.991170446 +0000 UTC m=+212.117298204" watchObservedRunningTime="2026-03-17 11:15:09.005972104 +0000 UTC m=+212.132099862" Mar 17 11:15:09 crc kubenswrapper[4742]: I0317 11:15:09.032514 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hv2p6" podStartSLOduration=146.032499019 podStartE2EDuration="2m26.032499019s" podCreationTimestamp="2026-03-17 11:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:09.032326074 +0000 UTC m=+212.158453852" watchObservedRunningTime="2026-03-17 11:15:09.032499019 +0000 UTC m=+212.158626777" Mar 17 11:15:09 crc kubenswrapper[4742]: I0317 11:15:09.047656 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=56.047634656 podStartE2EDuration="56.047634656s" podCreationTimestamp="2026-03-17 11:14:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:09.046290578 +0000 UTC m=+212.172418336" watchObservedRunningTime="2026-03-17 11:15:09.047634656 +0000 UTC m=+212.173762424" Mar 17 11:15:09 crc kubenswrapper[4742]: I0317 11:15:09.065219 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=112.065200224 podStartE2EDuration="1m52.065200224s" podCreationTimestamp="2026-03-17 11:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:09.065194734 +0000 UTC m=+212.191322502" watchObservedRunningTime="2026-03-17 11:15:09.065200224 +0000 UTC m=+212.191327992" Mar 17 11:15:09 crc kubenswrapper[4742]: I0317 11:15:09.078311 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kwrj5" podStartSLOduration=146.078296192 podStartE2EDuration="2m26.078296192s" podCreationTimestamp="2026-03-17 11:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:09.077184599 +0000 UTC m=+212.203312367" watchObservedRunningTime="2026-03-17 11:15:09.078296192 +0000 UTC m=+212.204423950" Mar 17 11:15:09 crc 
kubenswrapper[4742]: I0317 11:15:09.090159 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podStartSLOduration=146.090144304 podStartE2EDuration="2m26.090144304s" podCreationTimestamp="2026-03-17 11:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:09.089827075 +0000 UTC m=+212.215954853" watchObservedRunningTime="2026-03-17 11:15:09.090144304 +0000 UTC m=+212.216272062" Mar 17 11:15:09 crc kubenswrapper[4742]: I0317 11:15:09.107211 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xwmfc" podStartSLOduration=145.107195497 podStartE2EDuration="2m25.107195497s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:09.106571148 +0000 UTC m=+212.232698916" watchObservedRunningTime="2026-03-17 11:15:09.107195497 +0000 UTC m=+212.233323255" Mar 17 11:15:09 crc kubenswrapper[4742]: I0317 11:15:09.662102 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:09 crc kubenswrapper[4742]: I0317 11:15:09.662163 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:09 crc kubenswrapper[4742]: I0317 11:15:09.662182 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:09 crc kubenswrapper[4742]: I0317 11:15:09.662227 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:09 crc kubenswrapper[4742]: E0317 11:15:09.662433 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:09 crc kubenswrapper[4742]: E0317 11:15:09.662556 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:09 crc kubenswrapper[4742]: E0317 11:15:09.662643 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:09 crc kubenswrapper[4742]: E0317 11:15:09.662712 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:11 crc kubenswrapper[4742]: I0317 11:15:11.661992 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:11 crc kubenswrapper[4742]: I0317 11:15:11.662093 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:11 crc kubenswrapper[4742]: I0317 11:15:11.662248 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:11 crc kubenswrapper[4742]: I0317 11:15:11.662310 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:11 crc kubenswrapper[4742]: E0317 11:15:11.662444 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:11 crc kubenswrapper[4742]: E0317 11:15:11.662631 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:11 crc kubenswrapper[4742]: E0317 11:15:11.662831 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:11 crc kubenswrapper[4742]: E0317 11:15:11.663023 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.140118 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.140207 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.140235 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.140270 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.140294 4742 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-17T11:15:12Z","lastTransitionTime":"2026-03-17T11:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.215548 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn"] Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.216232 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.221135 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.221284 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.221694 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.222255 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.246716 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4534e629-2cc7-4443-bf52-d4ca35ffb8a9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lhrzn\" (UID: \"4534e629-2cc7-4443-bf52-d4ca35ffb8a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.246863 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4534e629-2cc7-4443-bf52-d4ca35ffb8a9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lhrzn\" (UID: \"4534e629-2cc7-4443-bf52-d4ca35ffb8a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.246896 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4534e629-2cc7-4443-bf52-d4ca35ffb8a9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lhrzn\" (UID: \"4534e629-2cc7-4443-bf52-d4ca35ffb8a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.246967 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4534e629-2cc7-4443-bf52-d4ca35ffb8a9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lhrzn\" (UID: \"4534e629-2cc7-4443-bf52-d4ca35ffb8a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.246995 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4534e629-2cc7-4443-bf52-d4ca35ffb8a9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lhrzn\" (UID: \"4534e629-2cc7-4443-bf52-d4ca35ffb8a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.348507 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4534e629-2cc7-4443-bf52-d4ca35ffb8a9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lhrzn\" (UID: \"4534e629-2cc7-4443-bf52-d4ca35ffb8a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.348630 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4534e629-2cc7-4443-bf52-d4ca35ffb8a9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lhrzn\" (UID: \"4534e629-2cc7-4443-bf52-d4ca35ffb8a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.348639 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4534e629-2cc7-4443-bf52-d4ca35ffb8a9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lhrzn\" (UID: \"4534e629-2cc7-4443-bf52-d4ca35ffb8a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.348742 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4534e629-2cc7-4443-bf52-d4ca35ffb8a9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lhrzn\" (UID: \"4534e629-2cc7-4443-bf52-d4ca35ffb8a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.348774 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4534e629-2cc7-4443-bf52-d4ca35ffb8a9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lhrzn\" (UID: \"4534e629-2cc7-4443-bf52-d4ca35ffb8a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.348830 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/4534e629-2cc7-4443-bf52-d4ca35ffb8a9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lhrzn\" (UID: \"4534e629-2cc7-4443-bf52-d4ca35ffb8a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.348859 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4534e629-2cc7-4443-bf52-d4ca35ffb8a9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lhrzn\" (UID: \"4534e629-2cc7-4443-bf52-d4ca35ffb8a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.350123 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4534e629-2cc7-4443-bf52-d4ca35ffb8a9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lhrzn\" (UID: \"4534e629-2cc7-4443-bf52-d4ca35ffb8a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.359185 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4534e629-2cc7-4443-bf52-d4ca35ffb8a9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lhrzn\" (UID: \"4534e629-2cc7-4443-bf52-d4ca35ffb8a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.382345 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4534e629-2cc7-4443-bf52-d4ca35ffb8a9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lhrzn\" (UID: \"4534e629-2cc7-4443-bf52-d4ca35ffb8a9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.539459 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" Mar 17 11:15:12 crc kubenswrapper[4742]: W0317 11:15:12.559063 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4534e629_2cc7_4443_bf52_d4ca35ffb8a9.slice/crio-67c3fc1833ed826182f5e2518cd963ba04db48620825669cdd6be9d198eda5ec WatchSource:0}: Error finding container 67c3fc1833ed826182f5e2518cd963ba04db48620825669cdd6be9d198eda5ec: Status 404 returned error can't find the container with id 67c3fc1833ed826182f5e2518cd963ba04db48620825669cdd6be9d198eda5ec Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.711985 4742 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.723285 4742 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.781022 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" event={"ID":"4534e629-2cc7-4443-bf52-d4ca35ffb8a9","Type":"ContainerStarted","Data":"889aebd0408b51d2d5046c31e7666e9ae3233a5b052cd184556a3aff2d7ccf6c"} Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.781086 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" event={"ID":"4534e629-2cc7-4443-bf52-d4ca35ffb8a9","Type":"ContainerStarted","Data":"67c3fc1833ed826182f5e2518cd963ba04db48620825669cdd6be9d198eda5ec"} Mar 17 11:15:12 crc kubenswrapper[4742]: I0317 11:15:12.806473 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lhrzn" podStartSLOduration=149.806442176 podStartE2EDuration="2m29.806442176s" podCreationTimestamp="2026-03-17 11:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:12.805138648 +0000 UTC m=+215.931266406" watchObservedRunningTime="2026-03-17 11:15:12.806442176 +0000 UTC m=+215.932569934" Mar 17 11:15:13 crc kubenswrapper[4742]: I0317 11:15:13.661833 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:13 crc kubenswrapper[4742]: I0317 11:15:13.662018 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:13 crc kubenswrapper[4742]: E0317 11:15:13.662044 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:13 crc kubenswrapper[4742]: I0317 11:15:13.662110 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:13 crc kubenswrapper[4742]: E0317 11:15:13.662374 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:13 crc kubenswrapper[4742]: E0317 11:15:13.662467 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:13 crc kubenswrapper[4742]: I0317 11:15:13.662115 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:13 crc kubenswrapper[4742]: E0317 11:15:13.662572 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:13 crc kubenswrapper[4742]: E0317 11:15:13.904259 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:15:14 crc kubenswrapper[4742]: I0317 11:15:14.664240 4742 scope.go:117] "RemoveContainer" containerID="80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2" Mar 17 11:15:14 crc kubenswrapper[4742]: E0317 11:15:14.665424 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zwfsr_openshift-ovn-kubernetes(d021cdee-f700-4a5f-a62e-be4acbb8c62e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" Mar 17 11:15:15 crc kubenswrapper[4742]: I0317 11:15:15.662183 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:15 crc kubenswrapper[4742]: I0317 11:15:15.662241 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:15 crc kubenswrapper[4742]: I0317 11:15:15.662284 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:15 crc kubenswrapper[4742]: I0317 11:15:15.662216 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:15 crc kubenswrapper[4742]: E0317 11:15:15.662379 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:15 crc kubenswrapper[4742]: E0317 11:15:15.662486 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:15 crc kubenswrapper[4742]: E0317 11:15:15.662629 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:15 crc kubenswrapper[4742]: E0317 11:15:15.662739 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:17 crc kubenswrapper[4742]: I0317 11:15:17.662774 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:17 crc kubenswrapper[4742]: I0317 11:15:17.662827 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:17 crc kubenswrapper[4742]: I0317 11:15:17.662785 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:17 crc kubenswrapper[4742]: E0317 11:15:17.662970 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:17 crc kubenswrapper[4742]: I0317 11:15:17.663057 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:17 crc kubenswrapper[4742]: E0317 11:15:17.663094 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:17 crc kubenswrapper[4742]: E0317 11:15:17.663279 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:17 crc kubenswrapper[4742]: E0317 11:15:17.663405 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:18 crc kubenswrapper[4742]: E0317 11:15:18.905498 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:15:19 crc kubenswrapper[4742]: I0317 11:15:19.662244 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:19 crc kubenswrapper[4742]: I0317 11:15:19.662297 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:19 crc kubenswrapper[4742]: I0317 11:15:19.662297 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:19 crc kubenswrapper[4742]: I0317 11:15:19.662249 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:19 crc kubenswrapper[4742]: E0317 11:15:19.662447 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:19 crc kubenswrapper[4742]: E0317 11:15:19.662560 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:19 crc kubenswrapper[4742]: E0317 11:15:19.662656 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:19 crc kubenswrapper[4742]: E0317 11:15:19.662804 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:19 crc kubenswrapper[4742]: I0317 11:15:19.807885 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwmfc_ff1068ee-5ebe-4575-806d-967a3b9bfb6a/kube-multus/1.log" Mar 17 11:15:19 crc kubenswrapper[4742]: I0317 11:15:19.808897 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwmfc_ff1068ee-5ebe-4575-806d-967a3b9bfb6a/kube-multus/0.log" Mar 17 11:15:19 crc kubenswrapper[4742]: I0317 11:15:19.808999 4742 generic.go:334] "Generic (PLEG): container finished" podID="ff1068ee-5ebe-4575-806d-967a3b9bfb6a" containerID="1a7dfbf3da964f99f958fe0751c5fdfaf6d1c1d5938316d5fa840c4187b524fe" exitCode=1 Mar 17 11:15:19 crc kubenswrapper[4742]: I0317 11:15:19.809043 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwmfc" event={"ID":"ff1068ee-5ebe-4575-806d-967a3b9bfb6a","Type":"ContainerDied","Data":"1a7dfbf3da964f99f958fe0751c5fdfaf6d1c1d5938316d5fa840c4187b524fe"} Mar 17 11:15:19 crc kubenswrapper[4742]: I0317 11:15:19.809093 4742 scope.go:117] "RemoveContainer" containerID="e41fa621c724c4e7363ede5397a18097136340ebf475c60c5436e30313a0a622" Mar 17 11:15:19 crc kubenswrapper[4742]: I0317 11:15:19.809719 4742 scope.go:117] "RemoveContainer" containerID="1a7dfbf3da964f99f958fe0751c5fdfaf6d1c1d5938316d5fa840c4187b524fe" Mar 17 11:15:19 crc kubenswrapper[4742]: E0317 11:15:19.810046 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-xwmfc_openshift-multus(ff1068ee-5ebe-4575-806d-967a3b9bfb6a)\"" pod="openshift-multus/multus-xwmfc" podUID="ff1068ee-5ebe-4575-806d-967a3b9bfb6a" Mar 17 11:15:20 crc kubenswrapper[4742]: I0317 11:15:20.817170 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwmfc_ff1068ee-5ebe-4575-806d-967a3b9bfb6a/kube-multus/1.log" Mar 17 11:15:21 crc kubenswrapper[4742]: I0317 11:15:21.662314 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:21 crc kubenswrapper[4742]: I0317 11:15:21.662389 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:21 crc kubenswrapper[4742]: I0317 11:15:21.662473 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:21 crc kubenswrapper[4742]: I0317 11:15:21.662482 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:21 crc kubenswrapper[4742]: E0317 11:15:21.662461 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:21 crc kubenswrapper[4742]: E0317 11:15:21.662622 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:21 crc kubenswrapper[4742]: E0317 11:15:21.662723 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:21 crc kubenswrapper[4742]: E0317 11:15:21.662842 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:23 crc kubenswrapper[4742]: I0317 11:15:23.662113 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:23 crc kubenswrapper[4742]: E0317 11:15:23.662992 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:23 crc kubenswrapper[4742]: I0317 11:15:23.662210 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:23 crc kubenswrapper[4742]: E0317 11:15:23.663242 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:23 crc kubenswrapper[4742]: I0317 11:15:23.662212 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:23 crc kubenswrapper[4742]: E0317 11:15:23.663429 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:23 crc kubenswrapper[4742]: I0317 11:15:23.662290 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:23 crc kubenswrapper[4742]: E0317 11:15:23.663628 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:23 crc kubenswrapper[4742]: E0317 11:15:23.907336 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:15:25 crc kubenswrapper[4742]: I0317 11:15:25.621746 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:25 crc kubenswrapper[4742]: I0317 11:15:25.621942 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:25 crc kubenswrapper[4742]: E0317 11:15:25.621962 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:17:27.621935147 +0000 UTC m=+350.748062905 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:25 crc kubenswrapper[4742]: I0317 11:15:25.622011 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:25 crc kubenswrapper[4742]: I0317 11:15:25.622088 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:25 crc kubenswrapper[4742]: E0317 11:15:25.622120 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 11:15:25 crc kubenswrapper[4742]: I0317 11:15:25.622133 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:25 crc kubenswrapper[4742]: E0317 11:15:25.622146 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 11:15:25 crc kubenswrapper[4742]: E0317 11:15:25.622166 4742 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:15:25 crc kubenswrapper[4742]: E0317 11:15:25.622230 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 11:17:27.622208615 +0000 UTC m=+350.748336403 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:15:25 crc kubenswrapper[4742]: E0317 11:15:25.622250 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 17 11:15:25 crc kubenswrapper[4742]: E0317 11:15:25.622264 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 17 11:15:25 crc kubenswrapper[4742]: E0317 11:15:25.622274 4742 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:15:25 crc kubenswrapper[4742]: E0317 11:15:25.622309 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 11:17:27.622302288 +0000 UTC m=+350.748430046 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 17 11:15:25 crc kubenswrapper[4742]: E0317 11:15:25.622326 4742 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 11:15:25 crc kubenswrapper[4742]: E0317 11:15:25.622452 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:17:27.622423851 +0000 UTC m=+350.748551639 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 17 11:15:25 crc kubenswrapper[4742]: E0317 11:15:25.622346 4742 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 11:15:25 crc kubenswrapper[4742]: E0317 11:15:25.622597 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:17:27.622560175 +0000 UTC m=+350.748687963 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 17 11:15:25 crc kubenswrapper[4742]: I0317 11:15:25.662178 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:25 crc kubenswrapper[4742]: I0317 11:15:25.662212 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:25 crc kubenswrapper[4742]: I0317 11:15:25.662207 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:25 crc kubenswrapper[4742]: E0317 11:15:25.662390 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:25 crc kubenswrapper[4742]: I0317 11:15:25.662465 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:25 crc kubenswrapper[4742]: E0317 11:15:25.662657 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:25 crc kubenswrapper[4742]: E0317 11:15:25.662745 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:25 crc kubenswrapper[4742]: E0317 11:15:25.662860 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:27 crc kubenswrapper[4742]: I0317 11:15:27.662569 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:27 crc kubenswrapper[4742]: I0317 11:15:27.662624 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:27 crc kubenswrapper[4742]: I0317 11:15:27.662633 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:27 crc kubenswrapper[4742]: E0317 11:15:27.662697 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:27 crc kubenswrapper[4742]: I0317 11:15:27.662799 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:27 crc kubenswrapper[4742]: E0317 11:15:27.662972 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:27 crc kubenswrapper[4742]: E0317 11:15:27.663221 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:27 crc kubenswrapper[4742]: E0317 11:15:27.663432 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:28 crc kubenswrapper[4742]: E0317 11:15:28.908432 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:15:29 crc kubenswrapper[4742]: I0317 11:15:29.662549 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:29 crc kubenswrapper[4742]: E0317 11:15:29.662748 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:29 crc kubenswrapper[4742]: I0317 11:15:29.663042 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:29 crc kubenswrapper[4742]: I0317 11:15:29.663109 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:29 crc kubenswrapper[4742]: I0317 11:15:29.663191 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:29 crc kubenswrapper[4742]: E0317 11:15:29.663224 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:29 crc kubenswrapper[4742]: E0317 11:15:29.663273 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:29 crc kubenswrapper[4742]: E0317 11:15:29.663496 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:29 crc kubenswrapper[4742]: I0317 11:15:29.664489 4742 scope.go:117] "RemoveContainer" containerID="80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2" Mar 17 11:15:29 crc kubenswrapper[4742]: I0317 11:15:29.964211 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovnkube-controller/3.log" Mar 17 11:15:29 crc kubenswrapper[4742]: I0317 11:15:29.968712 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerStarted","Data":"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a"} Mar 17 11:15:29 crc kubenswrapper[4742]: I0317 11:15:29.969274 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:15:30 crc kubenswrapper[4742]: I0317 11:15:30.576433 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podStartSLOduration=166.576408471 podStartE2EDuration="2m46.576408471s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:30.001540413 +0000 UTC m=+233.127668241" watchObservedRunningTime="2026-03-17 11:15:30.576408471 +0000 UTC m=+233.702536269" Mar 17 11:15:30 crc kubenswrapper[4742]: I0317 11:15:30.578113 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-drnx8"] Mar 17 11:15:30 crc kubenswrapper[4742]: I0317 11:15:30.578264 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:30 crc kubenswrapper[4742]: E0317 11:15:30.578423 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:31 crc kubenswrapper[4742]: I0317 11:15:31.662924 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:31 crc kubenswrapper[4742]: I0317 11:15:31.663018 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:31 crc kubenswrapper[4742]: E0317 11:15:31.663130 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:31 crc kubenswrapper[4742]: I0317 11:15:31.663198 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:31 crc kubenswrapper[4742]: I0317 11:15:31.663201 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:31 crc kubenswrapper[4742]: E0317 11:15:31.663339 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:31 crc kubenswrapper[4742]: E0317 11:15:31.663623 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:31 crc kubenswrapper[4742]: E0317 11:15:31.663653 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:33 crc kubenswrapper[4742]: I0317 11:15:33.661893 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:33 crc kubenswrapper[4742]: I0317 11:15:33.662052 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:33 crc kubenswrapper[4742]: I0317 11:15:33.661893 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:33 crc kubenswrapper[4742]: E0317 11:15:33.662102 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:33 crc kubenswrapper[4742]: I0317 11:15:33.662312 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:33 crc kubenswrapper[4742]: E0317 11:15:33.662524 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:33 crc kubenswrapper[4742]: E0317 11:15:33.662577 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:33 crc kubenswrapper[4742]: E0317 11:15:33.662690 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:33 crc kubenswrapper[4742]: E0317 11:15:33.910168 4742 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 17 11:15:34 crc kubenswrapper[4742]: I0317 11:15:34.527223 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs\") pod \"network-metrics-daemon-drnx8\" (UID: \"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\") " pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:34 crc kubenswrapper[4742]: E0317 11:15:34.527508 4742 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 11:15:34 crc kubenswrapper[4742]: E0317 11:15:34.527837 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs podName:6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14 nodeName:}" failed. No retries permitted until 2026-03-17 11:17:36.527807882 +0000 UTC m=+359.653935640 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs") pod "network-metrics-daemon-drnx8" (UID: "6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 17 11:15:34 crc kubenswrapper[4742]: I0317 11:15:34.662580 4742 scope.go:117] "RemoveContainer" containerID="1a7dfbf3da964f99f958fe0751c5fdfaf6d1c1d5938316d5fa840c4187b524fe" Mar 17 11:15:34 crc kubenswrapper[4742]: I0317 11:15:34.992107 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwmfc_ff1068ee-5ebe-4575-806d-967a3b9bfb6a/kube-multus/1.log" Mar 17 11:15:34 crc kubenswrapper[4742]: I0317 11:15:34.992626 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwmfc" event={"ID":"ff1068ee-5ebe-4575-806d-967a3b9bfb6a","Type":"ContainerStarted","Data":"49f006810bcc95db05a54979c00d1df941ae6ad018abc40980080ba41668f2fa"} Mar 17 11:15:35 crc kubenswrapper[4742]: I0317 11:15:35.662833 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:35 crc kubenswrapper[4742]: I0317 11:15:35.662849 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:35 crc kubenswrapper[4742]: I0317 11:15:35.662871 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:35 crc kubenswrapper[4742]: E0317 11:15:35.664153 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:35 crc kubenswrapper[4742]: I0317 11:15:35.662957 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:35 crc kubenswrapper[4742]: E0317 11:15:35.664278 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:35 crc kubenswrapper[4742]: E0317 11:15:35.664627 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:35 crc kubenswrapper[4742]: E0317 11:15:35.664626 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:37 crc kubenswrapper[4742]: I0317 11:15:37.662151 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:37 crc kubenswrapper[4742]: E0317 11:15:37.662388 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:15:37 crc kubenswrapper[4742]: I0317 11:15:37.662743 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:37 crc kubenswrapper[4742]: E0317 11:15:37.662875 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drnx8" podUID="6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14" Mar 17 11:15:37 crc kubenswrapper[4742]: I0317 11:15:37.663169 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:37 crc kubenswrapper[4742]: E0317 11:15:37.663287 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:15:37 crc kubenswrapper[4742]: I0317 11:15:37.663536 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:37 crc kubenswrapper[4742]: E0317 11:15:37.663680 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:15:39 crc kubenswrapper[4742]: I0317 11:15:39.662152 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:15:39 crc kubenswrapper[4742]: I0317 11:15:39.662210 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:15:39 crc kubenswrapper[4742]: I0317 11:15:39.662179 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:15:39 crc kubenswrapper[4742]: I0317 11:15:39.662287 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:15:39 crc kubenswrapper[4742]: I0317 11:15:39.667008 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 17 11:15:39 crc kubenswrapper[4742]: I0317 11:15:39.667164 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 17 11:15:39 crc kubenswrapper[4742]: I0317 11:15:39.667547 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 17 11:15:39 crc kubenswrapper[4742]: I0317 11:15:39.667273 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 17 11:15:39 crc kubenswrapper[4742]: I0317 11:15:39.667809 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 17 11:15:39 crc kubenswrapper[4742]: I0317 11:15:39.668124 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.590886 4742 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.653438 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9zclv"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.654301 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9zclv" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.660638 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-52v8r"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.661351 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.686087 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.686552 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.688318 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wdrq6"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.694400 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8pwp5"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.695051 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8pwp5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.695484 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.696541 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.696725 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-s5z9r"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.696800 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.697122 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.697160 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-s5z9r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.697435 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.697499 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-lfdfp"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.697686 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.698134 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.698679 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.698870 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.700484 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.700725 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.700871 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.701039 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.701098 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5zpx5"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.701180 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.701623 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zk827"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.702017 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-76tzr"] Mar 17 11:15:42 crc kubenswrapper[4742]: 
I0317 11:15:42.702363 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5zpx5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.702458 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-76tzr" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.702820 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.707661 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.709351 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.709533 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.709653 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.709774 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.709944 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.710043 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.710133 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.710194 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.710219 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.710313 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.710397 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.710521 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.710610 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.710624 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.710892 4742 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.711769 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.711845 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.717193 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.717740 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.718322 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.718482 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.718946 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-spkdx"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.719226 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tmn6g"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.718494 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.719552 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.719595 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-tmn6g" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.719892 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.720655 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bc2zs"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.720974 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.721020 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.721025 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.721045 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.722382 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.722730 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.723014 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.723227 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qtcq5"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.718898 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.723254 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.730067 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.733363 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9lz9n"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.733841 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.734334 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.734408 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-hwx7f"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.734577 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.734697 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.736698 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8pwp5"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.738041 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-s5z9r"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.738339 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.738991 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.746599 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.746898 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.747244 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.747486 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.747693 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.748068 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.774721 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hldfg"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.775536 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zvgjb"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.775776 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.776286 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zvgjb" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.776383 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.776574 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.776621 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hldfg" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.776679 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6n4cr"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.776706 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.776900 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.777026 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.777165 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.777266 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.777310 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.777389 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.777438 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.777492 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.777599 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.777606 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.777997 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.778134 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.778179 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.777655 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.778330 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.779469 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.792959 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.794383 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.795543 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4gd2t"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.795920 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4gd2t" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.796146 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.796594 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wdrq6"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.796644 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.796813 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.806187 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.809065 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.809152 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wbm7l"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.809235 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.809372 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.809534 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.809550 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9fkmh"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.809805 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.809987 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9fkmh" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.810168 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wbm7l" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.810228 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.810300 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.810436 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.810477 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.810957 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.812009 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hg7ln"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.812421 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hg7ln" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.814627 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.814936 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.815088 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.815209 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.815542 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.815659 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.816003 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.816613 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t2nj8"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.816992 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t2nj8" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.817181 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.817194 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.817274 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.817301 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.817386 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.817475 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.817517 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.817480 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.817596 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.817595 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.817736 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.817791 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.817999 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818647 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/954d6c46-40a1-4d36-b42f-5ef67aba794a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-76tzr\" (UID: \"954d6c46-40a1-4d36-b42f-5ef67aba794a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-76tzr" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818677 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-serving-cert\") pod \"route-controller-manager-6576b87f9c-4msdd\" (UID: \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818705 4742 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2r26\" (UniqueName: \"kubernetes.io/projected/0de428d9-1755-4c28-8c6e-cbb115aef7c7-kube-api-access-r2r26\") pod \"downloads-7954f5f757-s5z9r\" (UID: \"0de428d9-1755-4c28-8c6e-cbb115aef7c7\") " pod="openshift-console/downloads-7954f5f757-s5z9r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818721 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e415e748-23a1-4fdd-80ba-38308aaa4926-metrics-tls\") pod \"dns-operator-744455d44c-9zclv\" (UID: \"e415e748-23a1-4fdd-80ba-38308aaa4926\") " pod="openshift-dns-operator/dns-operator-744455d44c-9zclv" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818736 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50e9e286-63d8-4081-b085-ad6aa123b560-audit-dir\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818751 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn2r4\" (UniqueName: \"kubernetes.io/projected/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-kube-api-access-xn2r4\") pod \"route-controller-manager-6576b87f9c-4msdd\" (UID: \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818766 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-client-ca\") pod \"route-controller-manager-6576b87f9c-4msdd\" (UID: \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818782 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxq5r\" (UniqueName: \"kubernetes.io/projected/df58c683-d42a-46c4-9e5e-9b717ddc7956-kube-api-access-rxq5r\") pod \"openshift-config-operator-7777fb866f-8tf9v\" (UID: \"df58c683-d42a-46c4-9e5e-9b717ddc7956\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818798 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cd74653-8f7b-446d-8ded-b8816cf3f46a-service-ca-bundle\") pod \"authentication-operator-69f744f599-wdrq6\" (UID: \"8cd74653-8f7b-446d-8ded-b8816cf3f46a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818813 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz56r\" (UniqueName: \"kubernetes.io/projected/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-kube-api-access-nz56r\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818838 4742 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/497b1f19-025b-4b65-b062-b4a94eec3cfc-serving-cert\") pod \"controller-manager-879f6c89f-zk827\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818853 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwhqm\" (UniqueName: \"kubernetes.io/projected/af535295-2114-4275-b62f-3bee0eb830b5-kube-api-access-pwhqm\") pod \"openshift-apiserver-operator-796bbdcf4f-5zpx5\" (UID: \"af535295-2114-4275-b62f-3bee0eb830b5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5zpx5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818868 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af535295-2114-4275-b62f-3bee0eb830b5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5zpx5\" (UID: \"af535295-2114-4275-b62f-3bee0eb830b5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5zpx5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818883 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6pr5\" (UniqueName: \"kubernetes.io/projected/50e9e286-63d8-4081-b085-ad6aa123b560-kube-api-access-r6pr5\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818899 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqvmz\" (UniqueName: \"kubernetes.io/projected/2afdd196-9364-4f22-a98b-27f4d8602196-kube-api-access-fqvmz\") pod \"cluster-samples-operator-665b6dd947-8pwp5\" (UID: \"2afdd196-9364-4f22-a98b-27f4d8602196\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8pwp5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818940 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-serving-cert\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818955 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-oauth-config\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818970 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/954d6c46-40a1-4d36-b42f-5ef67aba794a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-76tzr\" (UID: \"954d6c46-40a1-4d36-b42f-5ef67aba794a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-76tzr" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.818986 4742 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjslf\" (UniqueName: \"kubernetes.io/projected/8cd74653-8f7b-446d-8ded-b8816cf3f46a-kube-api-access-rjslf\") pod \"authentication-operator-69f744f599-wdrq6\" (UID: \"8cd74653-8f7b-446d-8ded-b8816cf3f46a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819044 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50e9e286-63d8-4081-b085-ad6aa123b560-node-pullsecrets\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819076 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50e9e286-63d8-4081-b085-ad6aa123b560-serving-cert\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819107 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-client-ca\") pod \"controller-manager-879f6c89f-zk827\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819135 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkt5p\" (UniqueName: \"kubernetes.io/projected/e415e748-23a1-4fdd-80ba-38308aaa4926-kube-api-access-kkt5p\") pod \"dns-operator-744455d44c-9zclv\" (UID: \"e415e748-23a1-4fdd-80ba-38308aaa4926\") " pod="openshift-dns-operator/dns-operator-744455d44c-9zclv" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819155 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/50e9e286-63d8-4081-b085-ad6aa123b560-etcd-serving-ca\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819169 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50e9e286-63d8-4081-b085-ad6aa123b560-trusted-ca-bundle\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819188 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cd74653-8f7b-446d-8ded-b8816cf3f46a-serving-cert\") pod \"authentication-operator-69f744f599-wdrq6\" (UID: \"8cd74653-8f7b-446d-8ded-b8816cf3f46a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819206 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-service-ca\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819230 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-trusted-ca-bundle\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819245 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-oauth-serving-cert\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819266 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/50e9e286-63d8-4081-b085-ad6aa123b560-image-import-ca\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819280 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zk827\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819296 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-config\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819361 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwvbf\" (UniqueName: \"kubernetes.io/projected/954d6c46-40a1-4d36-b42f-5ef67aba794a-kube-api-access-hwvbf\") pod \"openshift-controller-manager-operator-756b6f6bc6-76tzr\" (UID: \"954d6c46-40a1-4d36-b42f-5ef67aba794a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-76tzr" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819405 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df58c683-d42a-46c4-9e5e-9b717ddc7956-serving-cert\") pod \"openshift-config-operator-7777fb866f-8tf9v\" (UID: \"df58c683-d42a-46c4-9e5e-9b717ddc7956\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819424 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-config\") pod 
\"route-controller-manager-6576b87f9c-4msdd\" (UID: \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819453 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cd74653-8f7b-446d-8ded-b8816cf3f46a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wdrq6\" (UID: \"8cd74653-8f7b-446d-8ded-b8816cf3f46a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819472 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/50e9e286-63d8-4081-b085-ad6aa123b560-encryption-config\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819495 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cd74653-8f7b-446d-8ded-b8816cf3f46a-config\") pod \"authentication-operator-69f744f599-wdrq6\" (UID: \"8cd74653-8f7b-446d-8ded-b8816cf3f46a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819512 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/50e9e286-63d8-4081-b085-ad6aa123b560-etcd-client\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819537 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pk7z\" (UniqueName: \"kubernetes.io/projected/497b1f19-025b-4b65-b062-b4a94eec3cfc-kube-api-access-7pk7z\") pod \"controller-manager-879f6c89f-zk827\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819552 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2afdd196-9364-4f22-a98b-27f4d8602196-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8pwp5\" (UID: \"2afdd196-9364-4f22-a98b-27f4d8602196\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8pwp5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819567 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50e9e286-63d8-4081-b085-ad6aa123b560-config\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819583 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/50e9e286-63d8-4081-b085-ad6aa123b560-audit\") pod \"apiserver-76f77b778f-52v8r\" (UID: 
\"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819607 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af535295-2114-4275-b62f-3bee0eb830b5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5zpx5\" (UID: \"af535295-2114-4275-b62f-3bee0eb830b5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5zpx5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819624 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/df58c683-d42a-46c4-9e5e-9b717ddc7956-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8tf9v\" (UID: \"df58c683-d42a-46c4-9e5e-9b717ddc7956\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.819662 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-config\") pod \"controller-manager-879f6c89f-zk827\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.821000 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.830328 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.831748 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.832409 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9zclv"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.832429 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5zpx5"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.832501 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.833279 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.834067 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.834120 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.834620 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.836958 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-52v8r"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.838226 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.838334 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.839269 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.843362 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cw5v4"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.844765 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nsx27"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.845213 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nsx27" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.845456 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cw5v4" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.845559 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562434-wtx87"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.845702 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.845743 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.846177 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562434-wtx87" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.873210 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.873540 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z2csl"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.874554 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-z2csl" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.875943 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.876196 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.876224 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.878693 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.950730 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.951412 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.952194 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.952825 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwhqm\" (UniqueName: \"kubernetes.io/projected/af535295-2114-4275-b62f-3bee0eb830b5-kube-api-access-pwhqm\") pod \"openshift-apiserver-operator-796bbdcf4f-5zpx5\" (UID: \"af535295-2114-4275-b62f-3bee0eb830b5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5zpx5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.952851 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh5bk\" (UniqueName: \"kubernetes.io/projected/a63c2414-b309-48e5-95f2-ab1b45577b92-kube-api-access-sh5bk\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.952867 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2800a131-02e6-49f1-9385-6065c4b4216e-stats-auth\") pod \"router-default-5444994796-hwx7f\" (UID: \"2800a131-02e6-49f1-9385-6065c4b4216e\") " pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.952887 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af535295-2114-4275-b62f-3bee0eb830b5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5zpx5\" (UID: \"af535295-2114-4275-b62f-3bee0eb830b5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5zpx5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.952925 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a63c2414-b309-48e5-95f2-ab1b45577b92-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.952941 4742 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a63c2414-b309-48e5-95f2-ab1b45577b92-audit-dir\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.952957 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6pr5\" (UniqueName: \"kubernetes.io/projected/50e9e286-63d8-4081-b085-ad6aa123b560-kube-api-access-r6pr5\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.952974 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqvmz\" (UniqueName: \"kubernetes.io/projected/2afdd196-9364-4f22-a98b-27f4d8602196-kube-api-access-fqvmz\") pod \"cluster-samples-operator-665b6dd947-8pwp5\" (UID: \"2afdd196-9364-4f22-a98b-27f4d8602196\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8pwp5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.952989 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a63c2414-b309-48e5-95f2-ab1b45577b92-serving-cert\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953004 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a63c2414-b309-48e5-95f2-ab1b45577b92-encryption-config\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953019 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-serving-cert\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953034 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m8ct\" (UniqueName: \"kubernetes.io/projected/2800a131-02e6-49f1-9385-6065c4b4216e-kube-api-access-4m8ct\") pod \"router-default-5444994796-hwx7f\" (UID: \"2800a131-02e6-49f1-9385-6065c4b4216e\") " pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953051 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2800a131-02e6-49f1-9385-6065c4b4216e-default-certificate\") pod \"router-default-5444994796-hwx7f\" (UID: \"2800a131-02e6-49f1-9385-6065c4b4216e\") " pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953069 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-oauth-config\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953085 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/954d6c46-40a1-4d36-b42f-5ef67aba794a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-76tzr\" (UID: \"954d6c46-40a1-4d36-b42f-5ef67aba794a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-76tzr" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953102 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjslf\" (UniqueName: \"kubernetes.io/projected/8cd74653-8f7b-446d-8ded-b8816cf3f46a-kube-api-access-rjslf\") pod \"authentication-operator-69f744f599-wdrq6\" (UID: \"8cd74653-8f7b-446d-8ded-b8816cf3f46a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953119 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f33a63f1-688a-46eb-a32f-5259fa969528-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6n4cr\" (UID: \"f33a63f1-688a-46eb-a32f-5259fa969528\") " pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953136 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2800a131-02e6-49f1-9385-6065c4b4216e-metrics-certs\") pod \"router-default-5444994796-hwx7f\" (UID: \"2800a131-02e6-49f1-9385-6065c4b4216e\") " pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953152 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50e9e286-63d8-4081-b085-ad6aa123b560-node-pullsecrets\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953169 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50e9e286-63d8-4081-b085-ad6aa123b560-serving-cert\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953186 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-client-ca\") pod \"controller-manager-879f6c89f-zk827\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953202 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkt5p\" (UniqueName: \"kubernetes.io/projected/e415e748-23a1-4fdd-80ba-38308aaa4926-kube-api-access-kkt5p\") pod \"dns-operator-744455d44c-9zclv\" (UID: 
\"e415e748-23a1-4fdd-80ba-38308aaa4926\") " pod="openshift-dns-operator/dns-operator-744455d44c-9zclv" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953216 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/50e9e286-63d8-4081-b085-ad6aa123b560-etcd-serving-ca\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953231 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50e9e286-63d8-4081-b085-ad6aa123b560-trusted-ca-bundle\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953248 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cd74653-8f7b-446d-8ded-b8816cf3f46a-serving-cert\") pod \"authentication-operator-69f744f599-wdrq6\" (UID: \"8cd74653-8f7b-446d-8ded-b8816cf3f46a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953264 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-service-ca\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953279 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa95c069-97da-45bf-ac92-c80160bd8648-serving-cert\") pod \"console-operator-58897d9998-tmn6g\" (UID: \"fa95c069-97da-45bf-ac92-c80160bd8648\") " pod="openshift-console-operator/console-operator-58897d9998-tmn6g" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953302 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-trusted-ca-bundle\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953318 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-oauth-serving-cert\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953333 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/50e9e286-63d8-4081-b085-ad6aa123b560-image-import-ca\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953348 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zk827\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953365 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f33a63f1-688a-46eb-a32f-5259fa969528-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6n4cr\" (UID: \"f33a63f1-688a-46eb-a32f-5259fa969528\") " pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953381 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-config\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953395 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa95c069-97da-45bf-ac92-c80160bd8648-trusted-ca\") pod \"console-operator-58897d9998-tmn6g\" (UID: \"fa95c069-97da-45bf-ac92-c80160bd8648\") " pod="openshift-console-operator/console-operator-58897d9998-tmn6g" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953410 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7lcv\" (UniqueName: \"kubernetes.io/projected/f33a63f1-688a-46eb-a32f-5259fa969528-kube-api-access-v7lcv\") pod \"marketplace-operator-79b997595-6n4cr\" (UID: \"f33a63f1-688a-46eb-a32f-5259fa969528\") " pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953433 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a63c2414-b309-48e5-95f2-ab1b45577b92-etcd-client\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953450 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwvbf\" (UniqueName: \"kubernetes.io/projected/954d6c46-40a1-4d36-b42f-5ef67aba794a-kube-api-access-hwvbf\") pod \"openshift-controller-manager-operator-756b6f6bc6-76tzr\" (UID: \"954d6c46-40a1-4d36-b42f-5ef67aba794a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-76tzr" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953465 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2800a131-02e6-49f1-9385-6065c4b4216e-service-ca-bundle\") pod \"router-default-5444994796-hwx7f\" (UID: \"2800a131-02e6-49f1-9385-6065c4b4216e\") " pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953488 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df58c683-d42a-46c4-9e5e-9b717ddc7956-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-8tf9v\" (UID: \"df58c683-d42a-46c4-9e5e-9b717ddc7956\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953504 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cd74653-8f7b-446d-8ded-b8816cf3f46a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wdrq6\" (UID: \"8cd74653-8f7b-446d-8ded-b8816cf3f46a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953520 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/50e9e286-63d8-4081-b085-ad6aa123b560-encryption-config\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953536 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-config\") pod \"route-controller-manager-6576b87f9c-4msdd\" (UID: \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953552 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cd74653-8f7b-446d-8ded-b8816cf3f46a-config\") pod \"authentication-operator-69f744f599-wdrq6\" (UID: \"8cd74653-8f7b-446d-8ded-b8816cf3f46a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953566 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/50e9e286-63d8-4081-b085-ad6aa123b560-etcd-client\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953581 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pk7z\" (UniqueName: \"kubernetes.io/projected/497b1f19-025b-4b65-b062-b4a94eec3cfc-kube-api-access-7pk7z\") pod \"controller-manager-879f6c89f-zk827\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953596 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2afdd196-9364-4f22-a98b-27f4d8602196-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8pwp5\" (UID: \"2afdd196-9364-4f22-a98b-27f4d8602196\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8pwp5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953610 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a63c2414-b309-48e5-95f2-ab1b45577b92-audit-policies\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 
11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953625 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af535295-2114-4275-b62f-3bee0eb830b5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5zpx5\" (UID: \"af535295-2114-4275-b62f-3bee0eb830b5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5zpx5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953643 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/df58c683-d42a-46c4-9e5e-9b717ddc7956-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8tf9v\" (UID: \"df58c683-d42a-46c4-9e5e-9b717ddc7956\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953660 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50e9e286-63d8-4081-b085-ad6aa123b560-config\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953675 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/50e9e286-63d8-4081-b085-ad6aa123b560-audit\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953692 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-config\") pod \"controller-manager-879f6c89f-zk827\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953715 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/954d6c46-40a1-4d36-b42f-5ef67aba794a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-76tzr\" (UID: \"954d6c46-40a1-4d36-b42f-5ef67aba794a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-76tzr" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953732 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-serving-cert\") pod \"route-controller-manager-6576b87f9c-4msdd\" (UID: \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953748 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scqnc\" (UniqueName: \"kubernetes.io/projected/361582e0-97ed-4927-b83f-642592572dac-kube-api-access-scqnc\") pod \"multus-admission-controller-857f4d67dd-9fkmh\" (UID: \"361582e0-97ed-4927-b83f-642592572dac\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9fkmh" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953766 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2r26\" 
(UniqueName: \"kubernetes.io/projected/0de428d9-1755-4c28-8c6e-cbb115aef7c7-kube-api-access-r2r26\") pod \"downloads-7954f5f757-s5z9r\" (UID: \"0de428d9-1755-4c28-8c6e-cbb115aef7c7\") " pod="openshift-console/downloads-7954f5f757-s5z9r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953782 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa95c069-97da-45bf-ac92-c80160bd8648-config\") pod \"console-operator-58897d9998-tmn6g\" (UID: \"fa95c069-97da-45bf-ac92-c80160bd8648\") " pod="openshift-console-operator/console-operator-58897d9998-tmn6g" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953798 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e415e748-23a1-4fdd-80ba-38308aaa4926-metrics-tls\") pod \"dns-operator-744455d44c-9zclv\" (UID: \"e415e748-23a1-4fdd-80ba-38308aaa4926\") " pod="openshift-dns-operator/dns-operator-744455d44c-9zclv" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953813 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50e9e286-63d8-4081-b085-ad6aa123b560-audit-dir\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953828 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn2r4\" (UniqueName: \"kubernetes.io/projected/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-kube-api-access-xn2r4\") pod \"route-controller-manager-6576b87f9c-4msdd\" (UID: \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953842 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-client-ca\") pod \"route-controller-manager-6576b87f9c-4msdd\" (UID: \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953858 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brbcz\" (UniqueName: \"kubernetes.io/projected/fa95c069-97da-45bf-ac92-c80160bd8648-kube-api-access-brbcz\") pod \"console-operator-58897d9998-tmn6g\" (UID: \"fa95c069-97da-45bf-ac92-c80160bd8648\") " pod="openshift-console-operator/console-operator-58897d9998-tmn6g" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953874 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxq5r\" (UniqueName: \"kubernetes.io/projected/df58c683-d42a-46c4-9e5e-9b717ddc7956-kube-api-access-rxq5r\") pod \"openshift-config-operator-7777fb866f-8tf9v\" (UID: \"df58c683-d42a-46c4-9e5e-9b717ddc7956\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953889 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a63c2414-b309-48e5-95f2-ab1b45577b92-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: 
\"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953920 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cd74653-8f7b-446d-8ded-b8816cf3f46a-service-ca-bundle\") pod \"authentication-operator-69f744f599-wdrq6\" (UID: \"8cd74653-8f7b-446d-8ded-b8816cf3f46a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953937 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz56r\" (UniqueName: \"kubernetes.io/projected/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-kube-api-access-nz56r\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953961 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/497b1f19-025b-4b65-b062-b4a94eec3cfc-serving-cert\") pod \"controller-manager-879f6c89f-zk827\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953970 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.954429 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-w4g9d"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.954644 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af535295-2114-4275-b62f-3bee0eb830b5-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5zpx5\" (UID: \"af535295-2114-4275-b62f-3bee0eb830b5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5zpx5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.954722 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lfdfp"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.954794 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w4g9d" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.955695 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.955852 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.957273 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.958785 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/954d6c46-40a1-4d36-b42f-5ef67aba794a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-76tzr\" (UID: \"954d6c46-40a1-4d36-b42f-5ef67aba794a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-76tzr" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.958795 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-service-ca\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.953976 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/361582e0-97ed-4927-b83f-642592572dac-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9fkmh\" (UID: \"361582e0-97ed-4927-b83f-642592572dac\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9fkmh" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.959707 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-trusted-ca-bundle\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.960130 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cd74653-8f7b-446d-8ded-b8816cf3f46a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wdrq6\" (UID: \"8cd74653-8f7b-446d-8ded-b8816cf3f46a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.960746 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df58c683-d42a-46c4-9e5e-9b717ddc7956-serving-cert\") pod \"openshift-config-operator-7777fb866f-8tf9v\" (UID: \"df58c683-d42a-46c4-9e5e-9b717ddc7956\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.960768 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zk827\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.960938 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/50e9e286-63d8-4081-b085-ad6aa123b560-image-import-ca\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.961228 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/8cd74653-8f7b-446d-8ded-b8816cf3f46a-config\") pod \"authentication-operator-69f744f599-wdrq6\" (UID: \"8cd74653-8f7b-446d-8ded-b8816cf3f46a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.961353 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-oauth-serving-cert\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.961357 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-config\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.961466 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tmn6g"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.961492 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-7khn5"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.961819 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-client-ca\") pod \"route-controller-manager-6576b87f9c-4msdd\" (UID: \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.961968 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cd74653-8f7b-446d-8ded-b8816cf3f46a-service-ca-bundle\") pod \"authentication-operator-69f744f599-wdrq6\" (UID: \"8cd74653-8f7b-446d-8ded-b8816cf3f46a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.961982 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-serving-cert\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.962018 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-76tzr"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.962086 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7khn5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.962316 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50e9e286-63d8-4081-b085-ad6aa123b560-config\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.962468 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/df58c683-d42a-46c4-9e5e-9b717ddc7956-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8tf9v\" (UID: \"df58c683-d42a-46c4-9e5e-9b717ddc7956\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.962583 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/50e9e286-63d8-4081-b085-ad6aa123b560-audit-dir\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.963180 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-config\") pod \"route-controller-manager-6576b87f9c-4msdd\" (UID: \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.963305 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2afdd196-9364-4f22-a98b-27f4d8602196-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8pwp5\" (UID: \"2afdd196-9364-4f22-a98b-27f4d8602196\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8pwp5" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.963575 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/50e9e286-63d8-4081-b085-ad6aa123b560-etcd-serving-ca\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.964317 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50e9e286-63d8-4081-b085-ad6aa123b560-trusted-ca-bundle\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.964345 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bc2zs"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.965728 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af535295-2114-4275-b62f-3bee0eb830b5-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5zpx5\" (UID: \"af535295-2114-4275-b62f-3bee0eb830b5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5zpx5" Mar 17 11:15:42 crc 
kubenswrapper[4742]: I0317 11:15:42.966018 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.967110 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.967269 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-config\") pod \"controller-manager-879f6c89f-zk827\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.967604 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/50e9e286-63d8-4081-b085-ad6aa123b560-node-pullsecrets\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.968053 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/954d6c46-40a1-4d36-b42f-5ef67aba794a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-76tzr\" (UID: \"954d6c46-40a1-4d36-b42f-5ef67aba794a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-76tzr" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.968330 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-client-ca\") pod \"controller-manager-879f6c89f-zk827\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.968864 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/50e9e286-63d8-4081-b085-ad6aa123b560-audit\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.969159 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/50e9e286-63d8-4081-b085-ad6aa123b560-encryption-config\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.969303 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-oauth-config\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.969637 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50e9e286-63d8-4081-b085-ad6aa123b560-serving-cert\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 
17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.969676 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.971420 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-serving-cert\") pod \"route-controller-manager-6576b87f9c-4msdd\" (UID: \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.972486 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-spkdx"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.974571 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.974635 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zk827"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.974656 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/497b1f19-025b-4b65-b062-b4a94eec3cfc-serving-cert\") pod \"controller-manager-879f6c89f-zk827\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.975007 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e415e748-23a1-4fdd-80ba-38308aaa4926-metrics-tls\") pod \"dns-operator-744455d44c-9zclv\" (UID: \"e415e748-23a1-4fdd-80ba-38308aaa4926\") " pod="openshift-dns-operator/dns-operator-744455d44c-9zclv" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.975302 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cd74653-8f7b-446d-8ded-b8816cf3f46a-serving-cert\") pod \"authentication-operator-69f744f599-wdrq6\" (UID: \"8cd74653-8f7b-446d-8ded-b8816cf3f46a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.976229 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.976404 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.977961 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.986644 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/50e9e286-63d8-4081-b085-ad6aa123b560-etcd-client\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.992734 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hldfg"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.994594 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qtcq5"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.995935 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9lz9n"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.996358 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.997024 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hg7ln"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.997857 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zvgjb"] Mar 17 11:15:42 crc kubenswrapper[4742]: I0317 11:15:42.999248 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nsx27"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.000091 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t2nj8"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.001661 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4gd2t"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.002881 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.004023 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.005367 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cpjwx"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.006172 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-58scf"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.006706 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-58scf" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.006935 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.007372 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562434-wtx87"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.009126 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.010785 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6n4cr"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.012027 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wbm7l"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.013484 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.015923 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.016240 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9fkmh"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.017876 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cw5v4"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.019250 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.021700 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cpjwx"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.022691 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w4g9d"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.024049 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z2csl"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.025085 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-58scf"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.026207 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg"] Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.036223 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.056734 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.060625 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m8ct\" (UniqueName: \"kubernetes.io/projected/2800a131-02e6-49f1-9385-6065c4b4216e-kube-api-access-4m8ct\") pod \"router-default-5444994796-hwx7f\" (UID: \"2800a131-02e6-49f1-9385-6065c4b4216e\") " pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.060679 4742 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2800a131-02e6-49f1-9385-6065c4b4216e-default-certificate\") pod \"router-default-5444994796-hwx7f\" (UID: \"2800a131-02e6-49f1-9385-6065c4b4216e\") " pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.060724 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f33a63f1-688a-46eb-a32f-5259fa969528-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6n4cr\" (UID: \"f33a63f1-688a-46eb-a32f-5259fa969528\") " pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.060757 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2800a131-02e6-49f1-9385-6065c4b4216e-metrics-certs\") pod \"router-default-5444994796-hwx7f\" (UID: \"2800a131-02e6-49f1-9385-6065c4b4216e\") " pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.060799 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa95c069-97da-45bf-ac92-c80160bd8648-serving-cert\") pod \"console-operator-58897d9998-tmn6g\" (UID: \"fa95c069-97da-45bf-ac92-c80160bd8648\") " pod="openshift-console-operator/console-operator-58897d9998-tmn6g" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.060844 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f33a63f1-688a-46eb-a32f-5259fa969528-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6n4cr\" (UID: \"f33a63f1-688a-46eb-a32f-5259fa969528\") " pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.060871 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa95c069-97da-45bf-ac92-c80160bd8648-trusted-ca\") pod \"console-operator-58897d9998-tmn6g\" (UID: \"fa95c069-97da-45bf-ac92-c80160bd8648\") " pod="openshift-console-operator/console-operator-58897d9998-tmn6g" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.061023 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7lcv\" (UniqueName: \"kubernetes.io/projected/f33a63f1-688a-46eb-a32f-5259fa969528-kube-api-access-v7lcv\") pod \"marketplace-operator-79b997595-6n4cr\" (UID: \"f33a63f1-688a-46eb-a32f-5259fa969528\") " pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.061087 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a63c2414-b309-48e5-95f2-ab1b45577b92-etcd-client\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.061137 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2800a131-02e6-49f1-9385-6065c4b4216e-service-ca-bundle\") pod \"router-default-5444994796-hwx7f\" (UID: 
\"2800a131-02e6-49f1-9385-6065c4b4216e\") " pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.061202 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a63c2414-b309-48e5-95f2-ab1b45577b92-audit-policies\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.061260 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scqnc\" (UniqueName: \"kubernetes.io/projected/361582e0-97ed-4927-b83f-642592572dac-kube-api-access-scqnc\") pod \"multus-admission-controller-857f4d67dd-9fkmh\" (UID: \"361582e0-97ed-4927-b83f-642592572dac\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9fkmh" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.061299 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa95c069-97da-45bf-ac92-c80160bd8648-config\") pod \"console-operator-58897d9998-tmn6g\" (UID: \"fa95c069-97da-45bf-ac92-c80160bd8648\") " pod="openshift-console-operator/console-operator-58897d9998-tmn6g" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.061337 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brbcz\" (UniqueName: \"kubernetes.io/projected/fa95c069-97da-45bf-ac92-c80160bd8648-kube-api-access-brbcz\") pod \"console-operator-58897d9998-tmn6g\" (UID: \"fa95c069-97da-45bf-ac92-c80160bd8648\") " pod="openshift-console-operator/console-operator-58897d9998-tmn6g" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.061387 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a63c2414-b309-48e5-95f2-ab1b45577b92-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.061436 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/361582e0-97ed-4927-b83f-642592572dac-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9fkmh\" (UID: \"361582e0-97ed-4927-b83f-642592572dac\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9fkmh" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.061485 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh5bk\" (UniqueName: \"kubernetes.io/projected/a63c2414-b309-48e5-95f2-ab1b45577b92-kube-api-access-sh5bk\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.061518 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2800a131-02e6-49f1-9385-6065c4b4216e-stats-auth\") pod \"router-default-5444994796-hwx7f\" (UID: \"2800a131-02e6-49f1-9385-6065c4b4216e\") " pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.061547 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a63c2414-b309-48e5-95f2-ab1b45577b92-audit-dir\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.061590 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a63c2414-b309-48e5-95f2-ab1b45577b92-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.061633 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a63c2414-b309-48e5-95f2-ab1b45577b92-serving-cert\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.061657 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a63c2414-b309-48e5-95f2-ab1b45577b92-encryption-config\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.062503 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a63c2414-b309-48e5-95f2-ab1b45577b92-audit-policies\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.062947 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a63c2414-b309-48e5-95f2-ab1b45577b92-audit-dir\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.062990 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a63c2414-b309-48e5-95f2-ab1b45577b92-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.063010 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa95c069-97da-45bf-ac92-c80160bd8648-config\") pod \"console-operator-58897d9998-tmn6g\" (UID: \"fa95c069-97da-45bf-ac92-c80160bd8648\") " pod="openshift-console-operator/console-operator-58897d9998-tmn6g" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.063243 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a63c2414-b309-48e5-95f2-ab1b45577b92-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.063295 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa95c069-97da-45bf-ac92-c80160bd8648-trusted-ca\") pod \"console-operator-58897d9998-tmn6g\" (UID: \"fa95c069-97da-45bf-ac92-c80160bd8648\") " pod="openshift-console-operator/console-operator-58897d9998-tmn6g" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.065925 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa95c069-97da-45bf-ac92-c80160bd8648-serving-cert\") pod \"console-operator-58897d9998-tmn6g\" (UID: \"fa95c069-97da-45bf-ac92-c80160bd8648\") " pod="openshift-console-operator/console-operator-58897d9998-tmn6g" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.066965 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a63c2414-b309-48e5-95f2-ab1b45577b92-etcd-client\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.067583 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a63c2414-b309-48e5-95f2-ab1b45577b92-serving-cert\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.076714 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.077864 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a63c2414-b309-48e5-95f2-ab1b45577b92-encryption-config\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.096475 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.106098 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2800a131-02e6-49f1-9385-6065c4b4216e-default-certificate\") pod \"router-default-5444994796-hwx7f\" (UID: \"2800a131-02e6-49f1-9385-6065c4b4216e\") " pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.117214 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.127209 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2800a131-02e6-49f1-9385-6065c4b4216e-stats-auth\") pod \"router-default-5444994796-hwx7f\" (UID: \"2800a131-02e6-49f1-9385-6065c4b4216e\") " pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.136767 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.157261 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 17 11:15:43 crc kubenswrapper[4742]: 
I0317 11:15:43.176557 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.186386 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2800a131-02e6-49f1-9385-6065c4b4216e-metrics-certs\") pod \"router-default-5444994796-hwx7f\" (UID: \"2800a131-02e6-49f1-9385-6065c4b4216e\") " pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.196188 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.204210 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2800a131-02e6-49f1-9385-6065c4b4216e-service-ca-bundle\") pod \"router-default-5444994796-hwx7f\" (UID: \"2800a131-02e6-49f1-9385-6065c4b4216e\") " pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.216156 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.239014 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.256899 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.276571 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.309446 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.316962 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.345466 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.356584 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.377959 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.397437 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.417753 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.438644 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.457740 4742 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.477187 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.498084 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.517975 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.557117 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.566791 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f33a63f1-688a-46eb-a32f-5259fa969528-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6n4cr\" (UID: \"f33a63f1-688a-46eb-a32f-5259fa969528\") " pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.577975 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.608264 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.615068 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f33a63f1-688a-46eb-a32f-5259fa969528-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6n4cr\" (UID: \"f33a63f1-688a-46eb-a32f-5259fa969528\") " pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.617442 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.638571 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.658753 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.678191 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.696744 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.717826 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.737363 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 
11:15:43.757296 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.778000 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.788006 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/361582e0-97ed-4927-b83f-642592572dac-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9fkmh\" (UID: \"361582e0-97ed-4927-b83f-642592572dac\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9fkmh"
Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.797414 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.815740 4742 request.go:700] Waited for 1.005379699s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-ac-dockercfg-9lkdf&limit=500&resourceVersion=0
Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.817900 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.837250 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.857110 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.877531 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.897745 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.918013 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.938484 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.958085 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 17 11:15:43 crc kubenswrapper[4742]: I0317 11:15:43.977969 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.017511 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.037900 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.057899 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.077362 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.098053 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.118538 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.137957 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.156970 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.178260 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.198618 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.218256 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.237505 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.258187 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.278202 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.297381 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.317608 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.337496 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.356768 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.376643 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.397786 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.417224 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.437723 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.456340 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.493869 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwhqm\" (UniqueName: \"kubernetes.io/projected/af535295-2114-4275-b62f-3bee0eb830b5-kube-api-access-pwhqm\") pod \"openshift-apiserver-operator-796bbdcf4f-5zpx5\" (UID: \"af535295-2114-4275-b62f-3bee0eb830b5\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5zpx5"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.524850 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6pr5\" (UniqueName: \"kubernetes.io/projected/50e9e286-63d8-4081-b085-ad6aa123b560-kube-api-access-r6pr5\") pod \"apiserver-76f77b778f-52v8r\" (UID: \"50e9e286-63d8-4081-b085-ad6aa123b560\") " pod="openshift-apiserver/apiserver-76f77b778f-52v8r"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.537467 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqvmz\" (UniqueName: \"kubernetes.io/projected/2afdd196-9364-4f22-a98b-27f4d8602196-kube-api-access-fqvmz\") pod \"cluster-samples-operator-665b6dd947-8pwp5\" (UID: \"2afdd196-9364-4f22-a98b-27f4d8602196\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8pwp5"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.537683 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.558040 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.577801 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.597027 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.617737 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.632385 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5zpx5"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.667743 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn2r4\" (UniqueName: \"kubernetes.io/projected/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-kube-api-access-xn2r4\") pod \"route-controller-manager-6576b87f9c-4msdd\" (UID: \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.686971 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwvbf\" (UniqueName: \"kubernetes.io/projected/954d6c46-40a1-4d36-b42f-5ef67aba794a-kube-api-access-hwvbf\") pod \"openshift-controller-manager-operator-756b6f6bc6-76tzr\" (UID: \"954d6c46-40a1-4d36-b42f-5ef67aba794a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-76tzr"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.706166 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pk7z\" (UniqueName: \"kubernetes.io/projected/497b1f19-025b-4b65-b062-b4a94eec3cfc-kube-api-access-7pk7z\") pod \"controller-manager-879f6c89f-zk827\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zk827"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.728287 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz56r\" (UniqueName: \"kubernetes.io/projected/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-kube-api-access-nz56r\") pod \"console-f9d7485db-lfdfp\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") " pod="openshift-console/console-f9d7485db-lfdfp"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.737740 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.745220 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxq5r\" (UniqueName: \"kubernetes.io/projected/df58c683-d42a-46c4-9e5e-9b717ddc7956-kube-api-access-rxq5r\") pod \"openshift-config-operator-7777fb866f-8tf9v\" (UID: \"df58c683-d42a-46c4-9e5e-9b717ddc7956\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.767206 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.777359 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.797430 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.805589 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.812360 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-52v8r"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.822085 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkt5p\" (UniqueName: \"kubernetes.io/projected/e415e748-23a1-4fdd-80ba-38308aaa4926-kube-api-access-kkt5p\") pod \"dns-operator-744455d44c-9zclv\" (UID: \"e415e748-23a1-4fdd-80ba-38308aaa4926\") " pod="openshift-dns-operator/dns-operator-744455d44c-9zclv"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.832206 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8pwp5"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.835501 4742 request.go:700] Waited for 1.87094433s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/serviceaccounts/authentication-operator/token
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.837331 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2r26\" (UniqueName: \"kubernetes.io/projected/0de428d9-1755-4c28-8c6e-cbb115aef7c7-kube-api-access-r2r26\") pod \"downloads-7954f5f757-s5z9r\" (UID: \"0de428d9-1755-4c28-8c6e-cbb115aef7c7\") " pod="openshift-console/downloads-7954f5f757-s5z9r"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.858007 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.858134 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjslf\" (UniqueName: \"kubernetes.io/projected/8cd74653-8f7b-446d-8ded-b8816cf3f46a-kube-api-access-rjslf\") pod \"authentication-operator-69f744f599-wdrq6\" (UID: \"8cd74653-8f7b-446d-8ded-b8816cf3f46a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.877161 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.890944 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5zpx5"]
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.895814 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.898705 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 17 11:15:44 crc kubenswrapper[4742]: W0317 11:15:44.906213 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf535295_2114_4275_b62f_3bee0eb830b5.slice/crio-0f69e8b3f92e30daf1fb8e751fa442a0f60e9a908e2e60f646e7298939cac9be WatchSource:0}: Error finding container 0f69e8b3f92e30daf1fb8e751fa442a0f60e9a908e2e60f646e7298939cac9be: Status 404 returned error can't find the container with id 0f69e8b3f92e30daf1fb8e751fa442a0f60e9a908e2e60f646e7298939cac9be
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.911218 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-s5z9r"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.918792 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.927717 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lfdfp"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.936377 4742 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.942247 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zk827"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.958688 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.980151 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-76tzr"
Mar 17 11:15:44 crc kubenswrapper[4742]: I0317 11:15:44.994979 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m8ct\" (UniqueName: \"kubernetes.io/projected/2800a131-02e6-49f1-9385-6065c4b4216e-kube-api-access-4m8ct\") pod \"router-default-5444994796-hwx7f\" (UID: \"2800a131-02e6-49f1-9385-6065c4b4216e\") " pod="openshift-ingress/router-default-5444994796-hwx7f"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.018456 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scqnc\" (UniqueName: \"kubernetes.io/projected/361582e0-97ed-4927-b83f-642592572dac-kube-api-access-scqnc\") pod \"multus-admission-controller-857f4d67dd-9fkmh\" (UID: \"361582e0-97ed-4927-b83f-642592572dac\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9fkmh"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.035358 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7lcv\" (UniqueName: \"kubernetes.io/projected/f33a63f1-688a-46eb-a32f-5259fa969528-kube-api-access-v7lcv\") pod \"marketplace-operator-79b997595-6n4cr\" (UID: \"f33a63f1-688a-46eb-a32f-5259fa969528\") " pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.035637 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5zpx5" event={"ID":"af535295-2114-4275-b62f-3bee0eb830b5","Type":"ContainerStarted","Data":"0f69e8b3f92e30daf1fb8e751fa442a0f60e9a908e2e60f646e7298939cac9be"}
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.055956 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh5bk\" (UniqueName: \"kubernetes.io/projected/a63c2414-b309-48e5-95f2-ab1b45577b92-kube-api-access-sh5bk\") pod \"apiserver-7bbb656c7d-hmmrg\" (UID: \"a63c2414-b309-48e5-95f2-ab1b45577b92\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.082427 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brbcz\" (UniqueName: \"kubernetes.io/projected/fa95c069-97da-45bf-ac92-c80160bd8648-kube-api-access-brbcz\") pod \"console-operator-58897d9998-tmn6g\" (UID: \"fa95c069-97da-45bf-ac92-c80160bd8648\") " pod="openshift-console-operator/console-operator-58897d9998-tmn6g"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.098094 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9zclv"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.114075 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-tmn6g"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.142606 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.182070 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-hwx7f"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.199629 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/efd3a468-c0d1-4736-8d34-35448326ade8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8xxd4\" (UID: \"efd3a468-c0d1-4736-8d34-35448326ade8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.199675 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f-proxy-tls\") pod \"machine-config-operator-74547568cd-xzn7v\" (UID: \"382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.199700 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5a8987e2-189e-4b66-9908-4bccb83f07a6-etcd-ca\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.199723 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fc09c5-43ef-4abe-8e2f-04221dad03c0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zvgjb\" (UID: \"a9fc09c5-43ef-4abe-8e2f-04221dad03c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zvgjb"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.199745 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/827bcca5-3d25-4f48-bee8-1f012196617b-config\") pod \"machine-approver-56656f9798-skgwz\" (UID: \"827bcca5-3d25-4f48-bee8-1f012196617b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.199806 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949d92a4-d000-43fe-91b2-c12bc9c86251-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hldfg\" (UID: \"949d92a4-d000-43fe-91b2-c12bc9c86251\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hldfg"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200030 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/827bcca5-3d25-4f48-bee8-1f012196617b-auth-proxy-config\") pod \"machine-approver-56656f9798-skgwz\" (UID: \"827bcca5-3d25-4f48-bee8-1f012196617b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200097 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a8987e2-189e-4b66-9908-4bccb83f07a6-config\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200123 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200147 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/efd3a468-c0d1-4736-8d34-35448326ade8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8xxd4\" (UID: \"efd3a468-c0d1-4736-8d34-35448326ade8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200204 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200231 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbfsw\" (UniqueName: \"kubernetes.io/projected/76ed03a0-90ee-4e37-9580-d7136a7fdc5e-kube-api-access-bbfsw\") pod \"machine-api-operator-5694c8668f-bc2zs\" (UID: \"76ed03a0-90ee-4e37-9580-d7136a7fdc5e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200252 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/023414d4-9886-49cf-ad52-b876be342763-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4gd2t\" (UID: \"023414d4-9886-49cf-ad52-b876be342763\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4gd2t"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200275 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4ce74de-8d87-4ad4-9a30-d96f45ac21b5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wbm7l\" (UID: \"b4ce74de-8d87-4ad4-9a30-d96f45ac21b5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wbm7l"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200310 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/61b81f5a-30d8-4c88-899b-5effb490bdee-audit-dir\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200334 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x6mr\" (UniqueName: \"kubernetes.io/projected/827bcca5-3d25-4f48-bee8-1f012196617b-kube-api-access-6x6mr\") pod \"machine-approver-56656f9798-skgwz\" (UID: \"827bcca5-3d25-4f48-bee8-1f012196617b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200357 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp7f7\" (UniqueName: \"kubernetes.io/projected/1c60b7f0-a64f-4968-a678-66b6cd89dc97-kube-api-access-lp7f7\") pod \"ingress-operator-5b745b69d9-2f4j6\" (UID: \"1c60b7f0-a64f-4968-a678-66b6cd89dc97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200623 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-registry-tls\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200652 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/76ed03a0-90ee-4e37-9580-d7136a7fdc5e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bc2zs\" (UID: \"76ed03a0-90ee-4e37-9580-d7136a7fdc5e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200710 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9fc09c5-43ef-4abe-8e2f-04221dad03c0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zvgjb\" (UID: \"a9fc09c5-43ef-4abe-8e2f-04221dad03c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zvgjb"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200744 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200769 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5wsb\" (UniqueName: \"kubernetes.io/projected/382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f-kube-api-access-n5wsb\") pod \"machine-config-operator-74547568cd-xzn7v\" (UID: \"382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200793 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200826 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/957049a3-8921-4ec9-a66c-d0fe15848fad-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-t2nj8\" (UID: \"957049a3-8921-4ec9-a66c-d0fe15848fad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t2nj8"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200863 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b38516a-3938-421e-9191-03786c23318c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200898 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b38516a-3938-421e-9191-03786c23318c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200940 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/023414d4-9886-49cf-ad52-b876be342763-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4gd2t\" (UID: \"023414d4-9886-49cf-ad52-b876be342763\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4gd2t"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200964 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.200985 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kll7c\" (UniqueName: \"kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-kube-api-access-kll7c\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201007 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xzn7v\" (UID: \"382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201030 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76ed03a0-90ee-4e37-9580-d7136a7fdc5e-config\") pod \"machine-api-operator-5694c8668f-bc2zs\" (UID: \"76ed03a0-90ee-4e37-9580-d7136a7fdc5e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201051 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a8987e2-189e-4b66-9908-4bccb83f07a6-etcd-service-ca\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201076 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvhnt\" (UniqueName: \"kubernetes.io/projected/61b81f5a-30d8-4c88-899b-5effb490bdee-kube-api-access-jvhnt\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201110 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/76ed03a0-90ee-4e37-9580-d7136a7fdc5e-images\") pod \"machine-api-operator-5694c8668f-bc2zs\" (UID: \"76ed03a0-90ee-4e37-9580-d7136a7fdc5e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201134 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7jgp\" (UniqueName: \"kubernetes.io/projected/efd3a468-c0d1-4736-8d34-35448326ade8-kube-api-access-c7jgp\") pod \"cluster-image-registry-operator-dc59b4c8b-8xxd4\" (UID: \"efd3a468-c0d1-4736-8d34-35448326ade8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201161 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201187 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/949d92a4-d000-43fe-91b2-c12bc9c86251-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hldfg\" (UID: \"949d92a4-d000-43fe-91b2-c12bc9c86251\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hldfg"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201211 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f-images\") pod \"machine-config-operator-74547568cd-xzn7v\" (UID: \"382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201404 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sjh8\" (UniqueName: \"kubernetes.io/projected/d5eebf39-d75b-460d-9d32-de0eca5b904d-kube-api-access-9sjh8\") pod \"migrator-59844c95c7-hg7ln\" (UID: \"d5eebf39-d75b-460d-9d32-de0eca5b904d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hg7ln"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201446 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4ce74de-8d87-4ad4-9a30-d96f45ac21b5-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wbm7l\" (UID: \"b4ce74de-8d87-4ad4-9a30-d96f45ac21b5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wbm7l"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201467 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx24z\" (UniqueName: \"kubernetes.io/projected/957049a3-8921-4ec9-a66c-d0fe15848fad-kube-api-access-qx24z\") pod \"control-plane-machine-set-operator-78cbb6b69f-t2nj8\" (UID: \"957049a3-8921-4ec9-a66c-d0fe15848fad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t2nj8"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201487 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c60b7f0-a64f-4968-a678-66b6cd89dc97-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2f4j6\" (UID: \"1c60b7f0-a64f-4968-a678-66b6cd89dc97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201506 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b38516a-3938-421e-9191-03786c23318c-trusted-ca\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201537 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fc09c5-43ef-4abe-8e2f-04221dad03c0-config\") pod \"kube-controller-manager-operator-78b949d7b-zvgjb\" (UID: \"a9fc09c5-43ef-4abe-8e2f-04221dad03c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zvgjb"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201555 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201587 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4ce74de-8d87-4ad4-9a30-d96f45ac21b5-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wbm7l\" (UID: \"b4ce74de-8d87-4ad4-9a30-d96f45ac21b5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wbm7l"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201645 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzgvg\" (UniqueName: \"kubernetes.io/projected/5a8987e2-189e-4b66-9908-4bccb83f07a6-kube-api-access-pzgvg\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201671 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201691 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a8987e2-189e-4b66-9908-4bccb83f07a6-serving-cert\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201723 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c60b7f0-a64f-4968-a678-66b6cd89dc97-trusted-ca\") pod \"ingress-operator-5b745b69d9-2f4j6\" (UID: \"1c60b7f0-a64f-4968-a678-66b6cd89dc97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201746 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/efd3a468-c0d1-4736-8d34-35448326ade8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8xxd4\" (UID: \"efd3a468-c0d1-4736-8d34-35448326ade8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201766 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/023414d4-9886-49cf-ad52-b876be342763-config\") pod \"kube-apiserver-operator-766d6c64bb-4gd2t\" (UID: \"023414d4-9886-49cf-ad52-b876be342763\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4gd2t"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201786 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a8987e2-189e-4b66-9908-4bccb83f07a6-etcd-client\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201808 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201829 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-bound-sa-token\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201851 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201890 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201937 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb7mx\" (UniqueName: \"kubernetes.io/projected/949d92a4-d000-43fe-91b2-c12bc9c86251-kube-api-access-gb7mx\") pod \"kube-storage-version-migrator-operator-b67b599dd-hldfg\" (UID: \"949d92a4-d000-43fe-91b2-c12bc9c86251\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hldfg"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.201960 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c60b7f0-a64f-4968-a678-66b6cd89dc97-metrics-tls\") pod \"ingress-operator-5b745b69d9-2f4j6\" (UID: \"1c60b7f0-a64f-4968-a678-66b6cd89dc97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.202005 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b38516a-3938-421e-9191-03786c23318c-registry-certificates\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.202030 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-audit-policies\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.202053 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.202086 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/827bcca5-3d25-4f48-bee8-1f012196617b-machine-approver-tls\") pod \"machine-approver-56656f9798-skgwz\" (UID: \"827bcca5-3d25-4f48-bee8-1f012196617b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz"
Mar 17 11:15:45 crc kubenswrapper[4742]: E0317 11:15:45.205794 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:45.705778606 +0000 UTC m=+248.831906364 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.207952 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.229269 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9fkmh"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.260582 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wdrq6"]
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.267987 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-52v8r"]
Mar 17 11:15:45 crc kubenswrapper[4742]: W0317 11:15:45.294808 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cd74653_8f7b_446d_8ded_b8816cf3f46a.slice/crio-97857a718a2dd4c2fd6ac8a4a67ed45b0fb179142be3131597e7c206b01fa166 WatchSource:0}: Error finding container 97857a718a2dd4c2fd6ac8a4a67ed45b0fb179142be3131597e7c206b01fa166: Status 404 returned error can't find the container with id 97857a718a2dd4c2fd6ac8a4a67ed45b0fb179142be3131597e7c206b01fa166
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.309794 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 17 11:15:45 crc kubenswrapper[4742]: E0317 11:15:45.310337 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:45.810263152 +0000 UTC m=+248.936390920 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310411 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c60b7f0-a64f-4968-a678-66b6cd89dc97-trusted-ca\") pod \"ingress-operator-5b745b69d9-2f4j6\" (UID: \"1c60b7f0-a64f-4968-a678-66b6cd89dc97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310454 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/efd3a468-c0d1-4736-8d34-35448326ade8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8xxd4\" (UID: \"efd3a468-c0d1-4736-8d34-35448326ade8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310478 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/023414d4-9886-49cf-ad52-b876be342763-config\") pod \"kube-apiserver-operator-766d6c64bb-4gd2t\" (UID: \"023414d4-9886-49cf-ad52-b876be342763\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4gd2t"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310500 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a8987e2-189e-4b66-9908-4bccb83f07a6-etcd-client\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310571 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bdcl\" (UniqueName: \"kubernetes.io/projected/a8d92daa-fa91-4eaa-9699-91459f58d17d-kube-api-access-5bdcl\") pod \"machine-config-server-7khn5\" (UID: \"a8d92daa-fa91-4eaa-9699-91459f58d17d\") " pod="openshift-machine-config-operator/machine-config-server-7khn5"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310619 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310671 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-bound-sa-token\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310696 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310737 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a8d92daa-fa91-4eaa-9699-91459f58d17d-node-bootstrap-token\") pod \"machine-config-server-7khn5\" (UID: \"a8d92daa-fa91-4eaa-9699-91459f58d17d\") " pod="openshift-machine-config-operator/machine-config-server-7khn5"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310768 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310792 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb7mx\" (UniqueName: \"kubernetes.io/projected/949d92a4-d000-43fe-91b2-c12bc9c86251-kube-api-access-gb7mx\") pod \"kube-storage-version-migrator-operator-b67b599dd-hldfg\" (UID: \"949d92a4-d000-43fe-91b2-c12bc9c86251\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hldfg"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310814 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c60b7f0-a64f-4968-a678-66b6cd89dc97-metrics-tls\") pod \"ingress-operator-5b745b69d9-2f4j6\" (UID: \"1c60b7f0-a64f-4968-a678-66b6cd89dc97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310835 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/aec28323-bc80-4975-a4c8-c3bd9a05c356-tmpfs\") pod \"packageserver-d55dfcdfc-gm8lg\" (UID: \"aec28323-bc80-4975-a4c8-c3bd9a05c356\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310860 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b38516a-3938-421e-9191-03786c23318c-registry-certificates\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310881 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-audit-policies\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310937 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310960 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrzqz\" (UniqueName: \"kubernetes.io/projected/8a19595a-7833-40fa-a836-f87d3a294f86-kube-api-access-xrzqz\") pod \"service-ca-9c57cc56f-z2csl\" (UID: \"8a19595a-7833-40fa-a836-f87d3a294f86\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2csl"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.310999 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/827bcca5-3d25-4f48-bee8-1f012196617b-machine-approver-tls\") pod \"machine-approver-56656f9798-skgwz\" (UID: \"827bcca5-3d25-4f48-bee8-1f012196617b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311020 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/efd3a468-c0d1-4736-8d34-35448326ade8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8xxd4\" (UID: \"efd3a468-c0d1-4736-8d34-35448326ade8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311056 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f-proxy-tls\") pod \"machine-config-operator-74547568cd-xzn7v\" (UID: \"382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311076 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5a8987e2-189e-4b66-9908-4bccb83f07a6-etcd-ca\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311098 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fc09c5-43ef-4abe-8e2f-04221dad03c0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zvgjb\" (UID: \"a9fc09c5-43ef-4abe-8e2f-04221dad03c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zvgjb"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311120 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/827bcca5-3d25-4f48-bee8-1f012196617b-config\") pod \"machine-approver-56656f9798-skgwz\" (UID: \"827bcca5-3d25-4f48-bee8-1f012196617b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311147 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx2t9\" (UniqueName: \"kubernetes.io/projected/74651634-b893-441a-9e3c-18a8eaeafcfa-kube-api-access-sx2t9\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311174 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/45e5ba25-cee1-4d34-8fff-a006d7cbbd5c-srv-cert\") pod \"olm-operator-6b444d44fb-kq8zp\" (UID: \"45e5ba25-cee1-4d34-8fff-a006d7cbbd5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311213 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949d92a4-d000-43fe-91b2-c12bc9c86251-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hldfg\" (UID: \"949d92a4-d000-43fe-91b2-c12bc9c86251\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hldfg"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311250 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmtvq\" (UniqueName: \"kubernetes.io/projected/aec28323-bc80-4975-a4c8-c3bd9a05c356-kube-api-access-mmtvq\") pod \"packageserver-d55dfcdfc-gm8lg\" (UID: \"aec28323-bc80-4975-a4c8-c3bd9a05c356\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311276 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86f6s\" (UniqueName: \"kubernetes.io/projected/5b3a8612-a5db-4ec8-9873-32829e2fe69e-kube-api-access-86f6s\") pod \"auto-csr-approver-29562434-wtx87\" (UID: \"5b3a8612-a5db-4ec8-9873-32829e2fe69e\") " pod="openshift-infra/auto-csr-approver-29562434-wtx87"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311301 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpsq7\" (UniqueName: \"kubernetes.io/projected/df1d9cfc-5349-44ec-bed8-ac71aa7741d1-kube-api-access-hpsq7\") pod \"service-ca-operator-777779d784-nsx27\" (UID: \"df1d9cfc-5349-44ec-bed8-ac71aa7741d1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nsx27"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311340 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/827bcca5-3d25-4f48-bee8-1f012196617b-auth-proxy-config\") pod \"machine-approver-56656f9798-skgwz\" (UID: \"827bcca5-3d25-4f48-bee8-1f012196617b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311376 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a8987e2-189e-4b66-9908-4bccb83f07a6-config\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311400 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3c93c286-d743-4512-a19b-baf28a72cd77-profile-collector-cert\") pod \"catalog-operator-68c6474976-2ftt5\" (UID: \"3c93c286-d743-4512-a19b-baf28a72cd77\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311427 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d98zd\" (UniqueName: \"kubernetes.io/projected/b007618f-b075-4f57-85e0-5fa8e89bc1bb-kube-api-access-d98zd\") pod \"ingress-canary-w4g9d\" (UID: \"b007618f-b075-4f57-85e0-5fa8e89bc1bb\") " pod="openshift-ingress-canary/ingress-canary-w4g9d"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311452 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/45e5ba25-cee1-4d34-8fff-a006d7cbbd5c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-kq8zp\" (UID: \"45e5ba25-cee1-4d34-8fff-a006d7cbbd5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311476 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311498 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/efd3a468-c0d1-4736-8d34-35448326ade8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8xxd4\" (UID: \"efd3a468-c0d1-4736-8d34-35448326ade8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311521 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/74651634-b893-441a-9e3c-18a8eaeafcfa-plugins-dir\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311558 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311582 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbfsw\" (UniqueName: \"kubernetes.io/projected/76ed03a0-90ee-4e37-9580-d7136a7fdc5e-kube-api-access-bbfsw\") pod \"machine-api-operator-5694c8668f-bc2zs\" (UID: \"76ed03a0-90ee-4e37-9580-d7136a7fdc5e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs"
Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311604 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/023414d4-9886-49cf-ad52-b876be342763-kube-api-access\") pod
\"kube-apiserver-operator-766d6c64bb-4gd2t\" (UID: \"023414d4-9886-49cf-ad52-b876be342763\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4gd2t" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311625 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4ce74de-8d87-4ad4-9a30-d96f45ac21b5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wbm7l\" (UID: \"b4ce74de-8d87-4ad4-9a30-d96f45ac21b5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wbm7l" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311646 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/74651634-b893-441a-9e3c-18a8eaeafcfa-socket-dir\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311714 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/61b81f5a-30d8-4c88-899b-5effb490bdee-audit-dir\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311739 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x6mr\" (UniqueName: \"kubernetes.io/projected/827bcca5-3d25-4f48-bee8-1f012196617b-kube-api-access-6x6mr\") pod \"machine-approver-56656f9798-skgwz\" (UID: \"827bcca5-3d25-4f48-bee8-1f012196617b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311760 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp7f7\" (UniqueName: \"kubernetes.io/projected/1c60b7f0-a64f-4968-a678-66b6cd89dc97-kube-api-access-lp7f7\") pod \"ingress-operator-5b745b69d9-2f4j6\" (UID: \"1c60b7f0-a64f-4968-a678-66b6cd89dc97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311780 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75629956-e407-4638-90cd-fd2f907bb0fb-config-volume\") pod \"collect-profiles-29562435-hmhmr\" (UID: \"75629956-e407-4638-90cd-fd2f907bb0fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311801 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aec28323-bc80-4975-a4c8-c3bd9a05c356-apiservice-cert\") pod \"packageserver-d55dfcdfc-gm8lg\" (UID: \"aec28323-bc80-4975-a4c8-c3bd9a05c356\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311822 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbt7b\" (UniqueName: \"kubernetes.io/projected/75629956-e407-4638-90cd-fd2f907bb0fb-kube-api-access-zbt7b\") pod \"collect-profiles-29562435-hmhmr\" (UID: 
\"75629956-e407-4638-90cd-fd2f907bb0fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311845 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-registry-tls\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311866 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/76ed03a0-90ee-4e37-9580-d7136a7fdc5e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bc2zs\" (UID: \"76ed03a0-90ee-4e37-9580-d7136a7fdc5e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311891 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7bb0b6b-532a-4492-9fa3-c24db5074886-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cw5v4\" (UID: \"a7bb0b6b-532a-4492-9fa3-c24db5074886\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cw5v4" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311951 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ba5a493d-557e-4551-a13f-bf257f49623b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-87n9v\" (UID: \"ba5a493d-557e-4551-a13f-bf257f49623b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.311992 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a8d92daa-fa91-4eaa-9699-91459f58d17d-certs\") pod \"machine-config-server-7khn5\" (UID: \"a8d92daa-fa91-4eaa-9699-91459f58d17d\") " pod="openshift-machine-config-operator/machine-config-server-7khn5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312027 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9fc09c5-43ef-4abe-8e2f-04221dad03c0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zvgjb\" (UID: \"a9fc09c5-43ef-4abe-8e2f-04221dad03c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zvgjb" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312051 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba5a493d-557e-4551-a13f-bf257f49623b-proxy-tls\") pod \"machine-config-controller-84d6567774-87n9v\" (UID: \"ba5a493d-557e-4551-a13f-bf257f49623b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312076 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5wsb\" (UniqueName: \"kubernetes.io/projected/382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f-kube-api-access-n5wsb\") pod 
\"machine-config-operator-74547568cd-xzn7v\" (UID: \"382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312100 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312128 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312173 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/957049a3-8921-4ec9-a66c-d0fe15848fad-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-t2nj8\" (UID: \"957049a3-8921-4ec9-a66c-d0fe15848fad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t2nj8" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312197 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b38516a-3938-421e-9191-03786c23318c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312219 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/74651634-b893-441a-9e3c-18a8eaeafcfa-csi-data-dir\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312242 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b38516a-3938-421e-9191-03786c23318c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312264 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/023414d4-9886-49cf-ad52-b876be342763-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4gd2t\" (UID: \"023414d4-9886-49cf-ad52-b876be342763\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4gd2t" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312288 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-login\") pod 
\"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312325 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kll7c\" (UniqueName: \"kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-kube-api-access-kll7c\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312350 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xzn7v\" (UID: \"382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312372 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76ed03a0-90ee-4e37-9580-d7136a7fdc5e-config\") pod \"machine-api-operator-5694c8668f-bc2zs\" (UID: \"76ed03a0-90ee-4e37-9580-d7136a7fdc5e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312393 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a8987e2-189e-4b66-9908-4bccb83f07a6-etcd-service-ca\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312413 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3c93c286-d743-4512-a19b-baf28a72cd77-srv-cert\") pod \"catalog-operator-68c6474976-2ftt5\" (UID: \"3c93c286-d743-4512-a19b-baf28a72cd77\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312483 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8a19595a-7833-40fa-a836-f87d3a294f86-signing-key\") pod \"service-ca-9c57cc56f-z2csl\" (UID: \"8a19595a-7833-40fa-a836-f87d3a294f86\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2csl" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312490 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c60b7f0-a64f-4968-a678-66b6cd89dc97-trusted-ca\") pod \"ingress-operator-5b745b69d9-2f4j6\" (UID: \"1c60b7f0-a64f-4968-a678-66b6cd89dc97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312523 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvhnt\" (UniqueName: \"kubernetes.io/projected/61b81f5a-30d8-4c88-899b-5effb490bdee-kube-api-access-jvhnt\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312551 4742 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7jgp\" (UniqueName: \"kubernetes.io/projected/efd3a468-c0d1-4736-8d34-35448326ade8-kube-api-access-c7jgp\") pod \"cluster-image-registry-operator-dc59b4c8b-8xxd4\" (UID: \"efd3a468-c0d1-4736-8d34-35448326ade8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312575 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/08eb3fa4-f651-4ba7-b367-1cc1e684398c-config-volume\") pod \"dns-default-58scf\" (UID: \"08eb3fa4-f651-4ba7-b367-1cc1e684398c\") " pod="openshift-dns/dns-default-58scf" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312596 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aec28323-bc80-4975-a4c8-c3bd9a05c356-webhook-cert\") pod \"packageserver-d55dfcdfc-gm8lg\" (UID: \"aec28323-bc80-4975-a4c8-c3bd9a05c356\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312635 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/76ed03a0-90ee-4e37-9580-d7136a7fdc5e-images\") pod \"machine-api-operator-5694c8668f-bc2zs\" (UID: \"76ed03a0-90ee-4e37-9580-d7136a7fdc5e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312671 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz762\" (UniqueName: \"kubernetes.io/projected/ba5a493d-557e-4551-a13f-bf257f49623b-kube-api-access-kz762\") pod \"machine-config-controller-84d6567774-87n9v\" (UID: \"ba5a493d-557e-4551-a13f-bf257f49623b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312707 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312731 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/949d92a4-d000-43fe-91b2-c12bc9c86251-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hldfg\" (UID: \"949d92a4-d000-43fe-91b2-c12bc9c86251\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hldfg" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312754 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/08eb3fa4-f651-4ba7-b367-1cc1e684398c-metrics-tls\") pod \"dns-default-58scf\" (UID: \"08eb3fa4-f651-4ba7-b367-1cc1e684398c\") " pod="openshift-dns/dns-default-58scf" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312779 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f-images\") pod \"machine-config-operator-74547568cd-xzn7v\" (UID: \"382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312817 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75629956-e407-4638-90cd-fd2f907bb0fb-secret-volume\") pod \"collect-profiles-29562435-hmhmr\" (UID: \"75629956-e407-4638-90cd-fd2f907bb0fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312861 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sjh8\" (UniqueName: \"kubernetes.io/projected/d5eebf39-d75b-460d-9d32-de0eca5b904d-kube-api-access-9sjh8\") pod \"migrator-59844c95c7-hg7ln\" (UID: \"d5eebf39-d75b-460d-9d32-de0eca5b904d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hg7ln" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.312972 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxdk7\" (UniqueName: \"kubernetes.io/projected/08eb3fa4-f651-4ba7-b367-1cc1e684398c-kube-api-access-hxdk7\") pod \"dns-default-58scf\" (UID: \"08eb3fa4-f651-4ba7-b367-1cc1e684398c\") " pod="openshift-dns/dns-default-58scf" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.313001 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dplvr\" (UniqueName: \"kubernetes.io/projected/45e5ba25-cee1-4d34-8fff-a006d7cbbd5c-kube-api-access-dplvr\") pod \"olm-operator-6b444d44fb-kq8zp\" (UID: \"45e5ba25-cee1-4d34-8fff-a006d7cbbd5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.313029 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnslh\" (UniqueName: \"kubernetes.io/projected/a7bb0b6b-532a-4492-9fa3-c24db5074886-kube-api-access-cnslh\") pod \"package-server-manager-789f6589d5-cw5v4\" (UID: \"a7bb0b6b-532a-4492-9fa3-c24db5074886\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cw5v4" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.313078 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2s8c\" (UniqueName: \"kubernetes.io/projected/3c93c286-d743-4512-a19b-baf28a72cd77-kube-api-access-h2s8c\") pod \"catalog-operator-68c6474976-2ftt5\" (UID: \"3c93c286-d743-4512-a19b-baf28a72cd77\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.313100 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b007618f-b075-4f57-85e0-5fa8e89bc1bb-cert\") pod \"ingress-canary-w4g9d\" (UID: \"b007618f-b075-4f57-85e0-5fa8e89bc1bb\") " pod="openshift-ingress-canary/ingress-canary-w4g9d" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.313123 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4ce74de-8d87-4ad4-9a30-d96f45ac21b5-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-wbm7l\" (UID: \"b4ce74de-8d87-4ad4-9a30-d96f45ac21b5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wbm7l" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.313147 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df1d9cfc-5349-44ec-bed8-ac71aa7741d1-config\") pod \"service-ca-operator-777779d784-nsx27\" (UID: \"df1d9cfc-5349-44ec-bed8-ac71aa7741d1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nsx27" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.313169 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c60b7f0-a64f-4968-a678-66b6cd89dc97-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2f4j6\" (UID: \"1c60b7f0-a64f-4968-a678-66b6cd89dc97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.313190 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8a19595a-7833-40fa-a836-f87d3a294f86-signing-cabundle\") pod \"service-ca-9c57cc56f-z2csl\" (UID: \"8a19595a-7833-40fa-a836-f87d3a294f86\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2csl" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.313211 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/74651634-b893-441a-9e3c-18a8eaeafcfa-mountpoint-dir\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.313237 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx24z\" (UniqueName: \"kubernetes.io/projected/957049a3-8921-4ec9-a66c-d0fe15848fad-kube-api-access-qx24z\") pod \"control-plane-machine-set-operator-78cbb6b69f-t2nj8\" (UID: \"957049a3-8921-4ec9-a66c-d0fe15848fad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t2nj8" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.313691 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/61b81f5a-30d8-4c88-899b-5effb490bdee-audit-dir\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.313259 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/74651634-b893-441a-9e3c-18a8eaeafcfa-registration-dir\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.319110 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b38516a-3938-421e-9191-03786c23318c-trusted-ca\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 
11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.319284 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fc09c5-43ef-4abe-8e2f-04221dad03c0-config\") pod \"kube-controller-manager-operator-78b949d7b-zvgjb\" (UID: \"a9fc09c5-43ef-4abe-8e2f-04221dad03c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zvgjb" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.319374 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.319546 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4ce74de-8d87-4ad4-9a30-d96f45ac21b5-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wbm7l\" (UID: \"b4ce74de-8d87-4ad4-9a30-d96f45ac21b5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wbm7l" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.319633 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzgvg\" (UniqueName: \"kubernetes.io/projected/5a8987e2-189e-4b66-9908-4bccb83f07a6-kube-api-access-pzgvg\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.319712 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.319798 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a8987e2-189e-4b66-9908-4bccb83f07a6-serving-cert\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.319886 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df1d9cfc-5349-44ec-bed8-ac71aa7741d1-serving-cert\") pod \"service-ca-operator-777779d784-nsx27\" (UID: \"df1d9cfc-5349-44ec-bed8-ac71aa7741d1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nsx27" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.321998 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.322195 4742 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/023414d4-9886-49cf-ad52-b876be342763-config\") pod \"kube-apiserver-operator-766d6c64bb-4gd2t\" (UID: \"023414d4-9886-49cf-ad52-b876be342763\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4gd2t" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.322648 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5a8987e2-189e-4b66-9908-4bccb83f07a6-etcd-client\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.323009 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: E0317 11:15:45.325103 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:45.824881055 +0000 UTC m=+248.951008813 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.326347 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-registry-tls\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.327531 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5a8987e2-189e-4b66-9908-4bccb83f07a6-etcd-service-ca\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.327610 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xzn7v\" (UID: \"382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.328144 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f-images\") pod \"machine-config-operator-74547568cd-xzn7v\" (UID: \"382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.328639 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a8987e2-189e-4b66-9908-4bccb83f07a6-config\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.330305 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/827bcca5-3d25-4f48-bee8-1f012196617b-auth-proxy-config\") pod \"machine-approver-56656f9798-skgwz\" (UID: \"827bcca5-3d25-4f48-bee8-1f012196617b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.331013 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949d92a4-d000-43fe-91b2-c12bc9c86251-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hldfg\" (UID: \"949d92a4-d000-43fe-91b2-c12bc9c86251\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hldfg" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.331084 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/949d92a4-d000-43fe-91b2-c12bc9c86251-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hldfg\" (UID: \"949d92a4-d000-43fe-91b2-c12bc9c86251\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hldfg" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.331511 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/76ed03a0-90ee-4e37-9580-d7136a7fdc5e-images\") pod \"machine-api-operator-5694c8668f-bc2zs\" (UID: \"76ed03a0-90ee-4e37-9580-d7136a7fdc5e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.331826 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.331983 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-audit-policies\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.332181 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b38516a-3938-421e-9191-03786c23318c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.334893 4742 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fc09c5-43ef-4abe-8e2f-04221dad03c0-config\") pod \"kube-controller-manager-operator-78b949d7b-zvgjb\" (UID: \"a9fc09c5-43ef-4abe-8e2f-04221dad03c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zvgjb" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.335825 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76ed03a0-90ee-4e37-9580-d7136a7fdc5e-config\") pod \"machine-api-operator-5694c8668f-bc2zs\" (UID: \"76ed03a0-90ee-4e37-9580-d7136a7fdc5e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.335874 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.336794 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b38516a-3938-421e-9191-03786c23318c-trusted-ca\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.339139 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.339192 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/efd3a468-c0d1-4736-8d34-35448326ade8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8xxd4\" (UID: \"efd3a468-c0d1-4736-8d34-35448326ade8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.339826 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/827bcca5-3d25-4f48-bee8-1f012196617b-config\") pod \"machine-approver-56656f9798-skgwz\" (UID: \"827bcca5-3d25-4f48-bee8-1f012196617b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.340069 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.340347 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f-proxy-tls\") pod \"machine-config-operator-74547568cd-xzn7v\" (UID: \"382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.340759 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c60b7f0-a64f-4968-a678-66b6cd89dc97-metrics-tls\") pod \"ingress-operator-5b745b69d9-2f4j6\" (UID: \"1c60b7f0-a64f-4968-a678-66b6cd89dc97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.340989 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b38516a-3938-421e-9191-03786c23318c-registry-certificates\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.342359 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4ce74de-8d87-4ad4-9a30-d96f45ac21b5-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wbm7l\" (UID: \"b4ce74de-8d87-4ad4-9a30-d96f45ac21b5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wbm7l" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.343117 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/efd3a468-c0d1-4736-8d34-35448326ade8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8xxd4\" (UID: \"efd3a468-c0d1-4736-8d34-35448326ade8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.344840 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5a8987e2-189e-4b66-9908-4bccb83f07a6-etcd-ca\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.351132 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.351445 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.352074 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.355537 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fc09c5-43ef-4abe-8e2f-04221dad03c0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-zvgjb\" (UID: \"a9fc09c5-43ef-4abe-8e2f-04221dad03c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zvgjb" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.355833 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/827bcca5-3d25-4f48-bee8-1f012196617b-machine-approver-tls\") pod \"machine-approver-56656f9798-skgwz\" (UID: \"827bcca5-3d25-4f48-bee8-1f012196617b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.360992 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4ce74de-8d87-4ad4-9a30-d96f45ac21b5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wbm7l\" (UID: \"b4ce74de-8d87-4ad4-9a30-d96f45ac21b5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wbm7l" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.361355 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4ce74de-8d87-4ad4-9a30-d96f45ac21b5-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wbm7l\" (UID: \"b4ce74de-8d87-4ad4-9a30-d96f45ac21b5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wbm7l" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.361488 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/957049a3-8921-4ec9-a66c-d0fe15848fad-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-t2nj8\" (UID: \"957049a3-8921-4ec9-a66c-d0fe15848fad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t2nj8" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.361532 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/023414d4-9886-49cf-ad52-b876be342763-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4gd2t\" (UID: \"023414d4-9886-49cf-ad52-b876be342763\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4gd2t" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.362326 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.362712 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b38516a-3938-421e-9191-03786c23318c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.362983 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.363378 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/76ed03a0-90ee-4e37-9580-d7136a7fdc5e-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bc2zs\" (UID: \"76ed03a0-90ee-4e37-9580-d7136a7fdc5e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.365586 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a8987e2-189e-4b66-9908-4bccb83f07a6-serving-cert\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.371061 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zk827"] Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.386463 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8pwp5"] Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.383622 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/023414d4-9886-49cf-ad52-b876be342763-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4gd2t\" (UID: \"023414d4-9886-49cf-ad52-b876be342763\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4gd2t" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.395086 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9zclv"] Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.400074 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd"] Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.400867 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/efd3a468-c0d1-4736-8d34-35448326ade8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8xxd4\" (UID: \"efd3a468-c0d1-4736-8d34-35448326ade8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4" Mar 17 11:15:45 crc kubenswrapper[4742]: W0317 11:15:45.407349 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ee15c68_88ae_4ca8_b3d5_94266082d7ba.slice/crio-8219a95caeb835f4d08cd802c62ee95dbfaae8be8387e93b26b2805f50dcd3c3 WatchSource:0}: Error finding container 8219a95caeb835f4d08cd802c62ee95dbfaae8be8387e93b26b2805f50dcd3c3: Status 404 returned error can't find the container with id 8219a95caeb835f4d08cd802c62ee95dbfaae8be8387e93b26b2805f50dcd3c3 Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.407383 
4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v"] Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.413977 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x6mr\" (UniqueName: \"kubernetes.io/projected/827bcca5-3d25-4f48-bee8-1f012196617b-kube-api-access-6x6mr\") pod \"machine-approver-56656f9798-skgwz\" (UID: \"827bcca5-3d25-4f48-bee8-1f012196617b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.420688 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.420931 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxdk7\" (UniqueName: \"kubernetes.io/projected/08eb3fa4-f651-4ba7-b367-1cc1e684398c-kube-api-access-hxdk7\") pod \"dns-default-58scf\" (UID: \"08eb3fa4-f651-4ba7-b367-1cc1e684398c\") " pod="openshift-dns/dns-default-58scf" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.420952 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnslh\" (UniqueName: \"kubernetes.io/projected/a7bb0b6b-532a-4492-9fa3-c24db5074886-kube-api-access-cnslh\") pod \"package-server-manager-789f6589d5-cw5v4\" (UID: \"a7bb0b6b-532a-4492-9fa3-c24db5074886\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cw5v4" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.420972 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dplvr\" (UniqueName: \"kubernetes.io/projected/45e5ba25-cee1-4d34-8fff-a006d7cbbd5c-kube-api-access-dplvr\") pod \"olm-operator-6b444d44fb-kq8zp\" (UID: \"45e5ba25-cee1-4d34-8fff-a006d7cbbd5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp" Mar 17 11:15:45 crc kubenswrapper[4742]: E0317 11:15:45.421025 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:45.92100153 +0000 UTC m=+249.047129288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421086 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df1d9cfc-5349-44ec-bed8-ac71aa7741d1-config\") pod \"service-ca-operator-777779d784-nsx27\" (UID: \"df1d9cfc-5349-44ec-bed8-ac71aa7741d1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nsx27" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421110 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2s8c\" (UniqueName: \"kubernetes.io/projected/3c93c286-d743-4512-a19b-baf28a72cd77-kube-api-access-h2s8c\") pod \"catalog-operator-68c6474976-2ftt5\" (UID: \"3c93c286-d743-4512-a19b-baf28a72cd77\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421128 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b007618f-b075-4f57-85e0-5fa8e89bc1bb-cert\") pod \"ingress-canary-w4g9d\" (UID: \"b007618f-b075-4f57-85e0-5fa8e89bc1bb\") " pod="openshift-ingress-canary/ingress-canary-w4g9d" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421156 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8a19595a-7833-40fa-a836-f87d3a294f86-signing-cabundle\") pod \"service-ca-9c57cc56f-z2csl\" (UID: \"8a19595a-7833-40fa-a836-f87d3a294f86\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2csl" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421174 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/74651634-b893-441a-9e3c-18a8eaeafcfa-mountpoint-dir\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421190 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/74651634-b893-441a-9e3c-18a8eaeafcfa-registration-dir\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421222 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df1d9cfc-5349-44ec-bed8-ac71aa7741d1-serving-cert\") pod \"service-ca-operator-777779d784-nsx27\" (UID: \"df1d9cfc-5349-44ec-bed8-ac71aa7741d1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nsx27" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421259 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bdcl\" (UniqueName: \"kubernetes.io/projected/a8d92daa-fa91-4eaa-9699-91459f58d17d-kube-api-access-5bdcl\") pod 
\"machine-config-server-7khn5\" (UID: \"a8d92daa-fa91-4eaa-9699-91459f58d17d\") " pod="openshift-machine-config-operator/machine-config-server-7khn5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421288 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a8d92daa-fa91-4eaa-9699-91459f58d17d-node-bootstrap-token\") pod \"machine-config-server-7khn5\" (UID: \"a8d92daa-fa91-4eaa-9699-91459f58d17d\") " pod="openshift-machine-config-operator/machine-config-server-7khn5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421311 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/aec28323-bc80-4975-a4c8-c3bd9a05c356-tmpfs\") pod \"packageserver-d55dfcdfc-gm8lg\" (UID: \"aec28323-bc80-4975-a4c8-c3bd9a05c356\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421336 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrzqz\" (UniqueName: \"kubernetes.io/projected/8a19595a-7833-40fa-a836-f87d3a294f86-kube-api-access-xrzqz\") pod \"service-ca-9c57cc56f-z2csl\" (UID: \"8a19595a-7833-40fa-a836-f87d3a294f86\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2csl" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421364 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx2t9\" (UniqueName: \"kubernetes.io/projected/74651634-b893-441a-9e3c-18a8eaeafcfa-kube-api-access-sx2t9\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421386 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/45e5ba25-cee1-4d34-8fff-a006d7cbbd5c-srv-cert\") pod \"olm-operator-6b444d44fb-kq8zp\" (UID: \"45e5ba25-cee1-4d34-8fff-a006d7cbbd5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421413 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmtvq\" (UniqueName: \"kubernetes.io/projected/aec28323-bc80-4975-a4c8-c3bd9a05c356-kube-api-access-mmtvq\") pod \"packageserver-d55dfcdfc-gm8lg\" (UID: \"aec28323-bc80-4975-a4c8-c3bd9a05c356\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421438 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86f6s\" (UniqueName: \"kubernetes.io/projected/5b3a8612-a5db-4ec8-9873-32829e2fe69e-kube-api-access-86f6s\") pod \"auto-csr-approver-29562434-wtx87\" (UID: \"5b3a8612-a5db-4ec8-9873-32829e2fe69e\") " pod="openshift-infra/auto-csr-approver-29562434-wtx87" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421460 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpsq7\" (UniqueName: \"kubernetes.io/projected/df1d9cfc-5349-44ec-bed8-ac71aa7741d1-kube-api-access-hpsq7\") pod \"service-ca-operator-777779d784-nsx27\" (UID: \"df1d9cfc-5349-44ec-bed8-ac71aa7741d1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nsx27" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421491 4742 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3c93c286-d743-4512-a19b-baf28a72cd77-profile-collector-cert\") pod \"catalog-operator-68c6474976-2ftt5\" (UID: \"3c93c286-d743-4512-a19b-baf28a72cd77\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421512 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d98zd\" (UniqueName: \"kubernetes.io/projected/b007618f-b075-4f57-85e0-5fa8e89bc1bb-kube-api-access-d98zd\") pod \"ingress-canary-w4g9d\" (UID: \"b007618f-b075-4f57-85e0-5fa8e89bc1bb\") " pod="openshift-ingress-canary/ingress-canary-w4g9d" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421539 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/45e5ba25-cee1-4d34-8fff-a006d7cbbd5c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-kq8zp\" (UID: \"45e5ba25-cee1-4d34-8fff-a006d7cbbd5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421561 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/74651634-b893-441a-9e3c-18a8eaeafcfa-plugins-dir\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421611 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/74651634-b893-441a-9e3c-18a8eaeafcfa-socket-dir\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421651 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75629956-e407-4638-90cd-fd2f907bb0fb-config-volume\") pod \"collect-profiles-29562435-hmhmr\" (UID: \"75629956-e407-4638-90cd-fd2f907bb0fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421675 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aec28323-bc80-4975-a4c8-c3bd9a05c356-apiservice-cert\") pod \"packageserver-d55dfcdfc-gm8lg\" (UID: \"aec28323-bc80-4975-a4c8-c3bd9a05c356\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421688 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df1d9cfc-5349-44ec-bed8-ac71aa7741d1-config\") pod \"service-ca-operator-777779d784-nsx27\" (UID: \"df1d9cfc-5349-44ec-bed8-ac71aa7741d1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nsx27" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421702 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbt7b\" (UniqueName: \"kubernetes.io/projected/75629956-e407-4638-90cd-fd2f907bb0fb-kube-api-access-zbt7b\") pod \"collect-profiles-29562435-hmhmr\" (UID: 
\"75629956-e407-4638-90cd-fd2f907bb0fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421765 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7bb0b6b-532a-4492-9fa3-c24db5074886-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cw5v4\" (UID: \"a7bb0b6b-532a-4492-9fa3-c24db5074886\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cw5v4" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421794 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ba5a493d-557e-4551-a13f-bf257f49623b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-87n9v\" (UID: \"ba5a493d-557e-4551-a13f-bf257f49623b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421816 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a8d92daa-fa91-4eaa-9699-91459f58d17d-certs\") pod \"machine-config-server-7khn5\" (UID: \"a8d92daa-fa91-4eaa-9699-91459f58d17d\") " pod="openshift-machine-config-operator/machine-config-server-7khn5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421846 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba5a493d-557e-4551-a13f-bf257f49623b-proxy-tls\") pod \"machine-config-controller-84d6567774-87n9v\" (UID: \"ba5a493d-557e-4551-a13f-bf257f49623b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421871 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.421948 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/74651634-b893-441a-9e3c-18a8eaeafcfa-csi-data-dir\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.422192 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8a19595a-7833-40fa-a836-f87d3a294f86-signing-cabundle\") pod \"service-ca-9c57cc56f-z2csl\" (UID: \"8a19595a-7833-40fa-a836-f87d3a294f86\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2csl" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.422479 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3c93c286-d743-4512-a19b-baf28a72cd77-srv-cert\") pod \"catalog-operator-68c6474976-2ftt5\" (UID: \"3c93c286-d743-4512-a19b-baf28a72cd77\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5" Mar 17 11:15:45 crc kubenswrapper[4742]: 
I0317 11:15:45.422511 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8a19595a-7833-40fa-a836-f87d3a294f86-signing-key\") pod \"service-ca-9c57cc56f-z2csl\" (UID: \"8a19595a-7833-40fa-a836-f87d3a294f86\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2csl" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.422535 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/08eb3fa4-f651-4ba7-b367-1cc1e684398c-config-volume\") pod \"dns-default-58scf\" (UID: \"08eb3fa4-f651-4ba7-b367-1cc1e684398c\") " pod="openshift-dns/dns-default-58scf" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.422550 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aec28323-bc80-4975-a4c8-c3bd9a05c356-webhook-cert\") pod \"packageserver-d55dfcdfc-gm8lg\" (UID: \"aec28323-bc80-4975-a4c8-c3bd9a05c356\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.422604 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz762\" (UniqueName: \"kubernetes.io/projected/ba5a493d-557e-4551-a13f-bf257f49623b-kube-api-access-kz762\") pod \"machine-config-controller-84d6567774-87n9v\" (UID: \"ba5a493d-557e-4551-a13f-bf257f49623b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.422628 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/08eb3fa4-f651-4ba7-b367-1cc1e684398c-metrics-tls\") pod \"dns-default-58scf\" (UID: \"08eb3fa4-f651-4ba7-b367-1cc1e684398c\") " pod="openshift-dns/dns-default-58scf" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.422646 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75629956-e407-4638-90cd-fd2f907bb0fb-secret-volume\") pod \"collect-profiles-29562435-hmhmr\" (UID: \"75629956-e407-4638-90cd-fd2f907bb0fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.422777 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/74651634-b893-441a-9e3c-18a8eaeafcfa-registration-dir\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.423843 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/74651634-b893-441a-9e3c-18a8eaeafcfa-mountpoint-dir\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.423919 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/74651634-b893-441a-9e3c-18a8eaeafcfa-socket-dir\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 
11:15:45.424203 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/74651634-b893-441a-9e3c-18a8eaeafcfa-plugins-dir\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.424438 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75629956-e407-4638-90cd-fd2f907bb0fb-config-volume\") pod \"collect-profiles-29562435-hmhmr\" (UID: \"75629956-e407-4638-90cd-fd2f907bb0fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.424522 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/74651634-b893-441a-9e3c-18a8eaeafcfa-csi-data-dir\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.424651 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ba5a493d-557e-4551-a13f-bf257f49623b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-87n9v\" (UID: \"ba5a493d-557e-4551-a13f-bf257f49623b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v" Mar 17 11:15:45 crc kubenswrapper[4742]: E0317 11:15:45.424955 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:45.924940243 +0000 UTC m=+249.051068001 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.425980 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b007618f-b075-4f57-85e0-5fa8e89bc1bb-cert\") pod \"ingress-canary-w4g9d\" (UID: \"b007618f-b075-4f57-85e0-5fa8e89bc1bb\") " pod="openshift-ingress-canary/ingress-canary-w4g9d" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.427314 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/08eb3fa4-f651-4ba7-b367-1cc1e684398c-config-volume\") pod \"dns-default-58scf\" (UID: \"08eb3fa4-f651-4ba7-b367-1cc1e684398c\") " pod="openshift-dns/dns-default-58scf" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.427597 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/aec28323-bc80-4975-a4c8-c3bd9a05c356-tmpfs\") pod \"packageserver-d55dfcdfc-gm8lg\" (UID: \"aec28323-bc80-4975-a4c8-c3bd9a05c356\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.427973 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75629956-e407-4638-90cd-fd2f907bb0fb-secret-volume\") pod \"collect-profiles-29562435-hmhmr\" (UID: \"75629956-e407-4638-90cd-fd2f907bb0fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.435852 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a8d92daa-fa91-4eaa-9699-91459f58d17d-certs\") pod \"machine-config-server-7khn5\" (UID: \"a8d92daa-fa91-4eaa-9699-91459f58d17d\") " pod="openshift-machine-config-operator/machine-config-server-7khn5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.439385 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df1d9cfc-5349-44ec-bed8-ac71aa7741d1-serving-cert\") pod \"service-ca-operator-777779d784-nsx27\" (UID: \"df1d9cfc-5349-44ec-bed8-ac71aa7741d1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nsx27" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.440201 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/45e5ba25-cee1-4d34-8fff-a006d7cbbd5c-srv-cert\") pod \"olm-operator-6b444d44fb-kq8zp\" (UID: \"45e5ba25-cee1-4d34-8fff-a006d7cbbd5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.440259 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/08eb3fa4-f651-4ba7-b367-1cc1e684398c-metrics-tls\") pod \"dns-default-58scf\" (UID: \"08eb3fa4-f651-4ba7-b367-1cc1e684398c\") " pod="openshift-dns/dns-default-58scf" Mar 17 11:15:45 crc 
kubenswrapper[4742]: I0317 11:15:45.444076 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3c93c286-d743-4512-a19b-baf28a72cd77-profile-collector-cert\") pod \"catalog-operator-68c6474976-2ftt5\" (UID: \"3c93c286-d743-4512-a19b-baf28a72cd77\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.444076 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/45e5ba25-cee1-4d34-8fff-a006d7cbbd5c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-kq8zp\" (UID: \"45e5ba25-cee1-4d34-8fff-a006d7cbbd5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.444511 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/aec28323-bc80-4975-a4c8-c3bd9a05c356-webhook-cert\") pod \"packageserver-d55dfcdfc-gm8lg\" (UID: \"aec28323-bc80-4975-a4c8-c3bd9a05c356\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.445668 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7bb0b6b-532a-4492-9fa3-c24db5074886-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cw5v4\" (UID: \"a7bb0b6b-532a-4492-9fa3-c24db5074886\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cw5v4" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.446521 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp7f7\" (UniqueName: \"kubernetes.io/projected/1c60b7f0-a64f-4968-a678-66b6cd89dc97-kube-api-access-lp7f7\") pod \"ingress-operator-5b745b69d9-2f4j6\" (UID: \"1c60b7f0-a64f-4968-a678-66b6cd89dc97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.446650 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ba5a493d-557e-4551-a13f-bf257f49623b-proxy-tls\") pod \"machine-config-controller-84d6567774-87n9v\" (UID: \"ba5a493d-557e-4551-a13f-bf257f49623b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.447346 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/aec28323-bc80-4975-a4c8-c3bd9a05c356-apiservice-cert\") pod \"packageserver-d55dfcdfc-gm8lg\" (UID: \"aec28323-bc80-4975-a4c8-c3bd9a05c356\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.447739 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3c93c286-d743-4512-a19b-baf28a72cd77-srv-cert\") pod \"catalog-operator-68c6474976-2ftt5\" (UID: \"3c93c286-d743-4512-a19b-baf28a72cd77\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.448134 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/8a19595a-7833-40fa-a836-f87d3a294f86-signing-key\") pod \"service-ca-9c57cc56f-z2csl\" (UID: \"8a19595a-7833-40fa-a836-f87d3a294f86\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2csl" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.450597 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a8d92daa-fa91-4eaa-9699-91459f58d17d-node-bootstrap-token\") pod \"machine-config-server-7khn5\" (UID: \"a8d92daa-fa91-4eaa-9699-91459f58d17d\") " pod="openshift-machine-config-operator/machine-config-server-7khn5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.455968 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.456568 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbfsw\" (UniqueName: \"kubernetes.io/projected/76ed03a0-90ee-4e37-9580-d7136a7fdc5e-kube-api-access-bbfsw\") pod \"machine-api-operator-5694c8668f-bc2zs\" (UID: \"76ed03a0-90ee-4e37-9580-d7136a7fdc5e\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.474694 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-bound-sa-token\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.484292 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-lfdfp"] Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.491760 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-76tzr"] Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.496744 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-s5z9r"] Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.496758 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzgvg\" (UniqueName: \"kubernetes.io/projected/5a8987e2-189e-4b66-9908-4bccb83f07a6-kube-api-access-pzgvg\") pod \"etcd-operator-b45778765-qtcq5\" (UID: \"5a8987e2-189e-4b66-9908-4bccb83f07a6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5" Mar 17 11:15:45 crc kubenswrapper[4742]: W0317 11:15:45.501346 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcb66d58_3d7a_47db_b3ff_2ede326cbe34.slice/crio-e9d3aae0219dafebe6c09c6791b8113653e912cf485f713fee521bccffa165fc WatchSource:0}: Error finding container e9d3aae0219dafebe6c09c6791b8113653e912cf485f713fee521bccffa165fc: Status 404 returned error can't find the container with id e9d3aae0219dafebe6c09c6791b8113653e912cf485f713fee521bccffa165fc Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.512782 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kll7c\" (UniqueName: \"kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-kube-api-access-kll7c\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.514153 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4gd2t" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.523137 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:45 crc kubenswrapper[4742]: E0317 11:15:45.523295 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:46.023274673 +0000 UTC m=+249.149402431 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.523402 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:45 crc kubenswrapper[4742]: E0317 11:15:45.523977 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:46.023962093 +0000 UTC m=+249.150089851 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.529538 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvhnt\" (UniqueName: \"kubernetes.io/projected/61b81f5a-30d8-4c88-899b-5effb490bdee-kube-api-access-jvhnt\") pod \"oauth-openshift-558db77b4-spkdx\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.535722 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wbm7l" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.550139 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7jgp\" (UniqueName: \"kubernetes.io/projected/efd3a468-c0d1-4736-8d34-35448326ade8-kube-api-access-c7jgp\") pod \"cluster-image-registry-operator-dc59b4c8b-8xxd4\" (UID: \"efd3a468-c0d1-4736-8d34-35448326ade8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.569525 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sjh8\" (UniqueName: \"kubernetes.io/projected/d5eebf39-d75b-460d-9d32-de0eca5b904d-kube-api-access-9sjh8\") pod \"migrator-59844c95c7-hg7ln\" (UID: \"d5eebf39-d75b-460d-9d32-de0eca5b904d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hg7ln" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.597837 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c60b7f0-a64f-4968-a678-66b6cd89dc97-bound-sa-token\") pod \"ingress-operator-5b745b69d9-2f4j6\" (UID: \"1c60b7f0-a64f-4968-a678-66b6cd89dc97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.615277 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx24z\" (UniqueName: \"kubernetes.io/projected/957049a3-8921-4ec9-a66c-d0fe15848fad-kube-api-access-qx24z\") pod \"control-plane-machine-set-operator-78cbb6b69f-t2nj8\" (UID: \"957049a3-8921-4ec9-a66c-d0fe15848fad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t2nj8" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.624364 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:45 crc kubenswrapper[4742]: E0317 11:15:45.624504 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:46.124477565 +0000 UTC m=+249.250605323 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.624835 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:45 crc kubenswrapper[4742]: E0317 11:15:45.625089 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:46.125081462 +0000 UTC m=+249.251209220 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.631942 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb7mx\" (UniqueName: \"kubernetes.io/projected/949d92a4-d000-43fe-91b2-c12bc9c86251-kube-api-access-gb7mx\") pod \"kube-storage-version-migrator-operator-b67b599dd-hldfg\" (UID: \"949d92a4-d000-43fe-91b2-c12bc9c86251\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hldfg" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.649325 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg"] Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.652178 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tmn6g"] Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.685771 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5wsb\" (UniqueName: \"kubernetes.io/projected/382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f-kube-api-access-n5wsb\") pod \"machine-config-operator-74547568cd-xzn7v\" (UID: \"382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.689799 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a9fc09c5-43ef-4abe-8e2f-04221dad03c0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-zvgjb\" (UID: \"a9fc09c5-43ef-4abe-8e2f-04221dad03c0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zvgjb" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.712934 4742 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxdk7\" (UniqueName: \"kubernetes.io/projected/08eb3fa4-f651-4ba7-b367-1cc1e684398c-kube-api-access-hxdk7\") pod \"dns-default-58scf\" (UID: \"08eb3fa4-f651-4ba7-b367-1cc1e684398c\") " pod="openshift-dns/dns-default-58scf" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.720699 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.725862 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6n4cr"] Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.725942 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9fkmh"] Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.726338 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:45 crc kubenswrapper[4742]: E0317 11:15:45.726722 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:46.226706017 +0000 UTC m=+249.352833775 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.728239 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.731586 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnslh\" (UniqueName: \"kubernetes.io/projected/a7bb0b6b-532a-4492-9fa3-c24db5074886-kube-api-access-cnslh\") pod \"package-server-manager-789f6589d5-cw5v4\" (UID: \"a7bb0b6b-532a-4492-9fa3-c24db5074886\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cw5v4" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.736188 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4" Mar 17 11:15:45 crc kubenswrapper[4742]: W0317 11:15:45.746494 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf33a63f1_688a_46eb_a32f_5259fa969528.slice/crio-a0cfa0b1e3062ca7ba5b72dc6b75c65c2f3f3b476a1faba51de5910fe5691f5c WatchSource:0}: Error finding container a0cfa0b1e3062ca7ba5b72dc6b75c65c2f3f3b476a1faba51de5910fe5691f5c: Status 404 returned error can't find the container with id a0cfa0b1e3062ca7ba5b72dc6b75c65c2f3f3b476a1faba51de5910fe5691f5c Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.748936 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.755761 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dplvr\" (UniqueName: \"kubernetes.io/projected/45e5ba25-cee1-4d34-8fff-a006d7cbbd5c-kube-api-access-dplvr\") pod \"olm-operator-6b444d44fb-kq8zp\" (UID: \"45e5ba25-cee1-4d34-8fff-a006d7cbbd5c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.779419 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4gd2t"] Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.782882 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2s8c\" (UniqueName: \"kubernetes.io/projected/3c93c286-d743-4512-a19b-baf28a72cd77-kube-api-access-h2s8c\") pod \"catalog-operator-68c6474976-2ftt5\" (UID: \"3c93c286-d743-4512-a19b-baf28a72cd77\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.786621 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.793968 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zvgjb" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.794612 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86f6s\" (UniqueName: \"kubernetes.io/projected/5b3a8612-a5db-4ec8-9873-32829e2fe69e-kube-api-access-86f6s\") pod \"auto-csr-approver-29562434-wtx87\" (UID: \"5b3a8612-a5db-4ec8-9873-32829e2fe69e\") " pod="openshift-infra/auto-csr-approver-29562434-wtx87" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.800198 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hldfg" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.831344 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.831679 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:45 crc kubenswrapper[4742]: E0317 11:15:45.832034 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:46.332017457 +0000 UTC m=+249.458145215 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.838031 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmtvq\" (UniqueName: \"kubernetes.io/projected/aec28323-bc80-4975-a4c8-c3bd9a05c356-kube-api-access-mmtvq\") pod \"packageserver-d55dfcdfc-gm8lg\" (UID: \"aec28323-bc80-4975-a4c8-c3bd9a05c356\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.843456 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hg7ln" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.848769 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wbm7l"] Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.850406 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t2nj8" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.851077 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbt7b\" (UniqueName: \"kubernetes.io/projected/75629956-e407-4638-90cd-fd2f907bb0fb-kube-api-access-zbt7b\") pod \"collect-profiles-29562435-hmhmr\" (UID: \"75629956-e407-4638-90cd-fd2f907bb0fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.851459 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpsq7\" (UniqueName: \"kubernetes.io/projected/df1d9cfc-5349-44ec-bed8-ac71aa7741d1-kube-api-access-hpsq7\") pod \"service-ca-operator-777779d784-nsx27\" (UID: \"df1d9cfc-5349-44ec-bed8-ac71aa7741d1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nsx27" Mar 17 11:15:45 crc kubenswrapper[4742]: W0317 11:15:45.854095 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod023414d4_9886_49cf_ad52_b876be342763.slice/crio-df62836ac38fb2fa17c4125d0b8ee704b3c7be047fe4d5299b545db580d04734 WatchSource:0}: Error finding container df62836ac38fb2fa17c4125d0b8ee704b3c7be047fe4d5299b545db580d04734: Status 404 returned error can't find the container with id df62836ac38fb2fa17c4125d0b8ee704b3c7be047fe4d5299b545db580d04734 Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.871373 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.876696 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrzqz\" (UniqueName: \"kubernetes.io/projected/8a19595a-7833-40fa-a836-f87d3a294f86-kube-api-access-xrzqz\") pod \"service-ca-9c57cc56f-z2csl\" (UID: \"8a19595a-7833-40fa-a836-f87d3a294f86\") " pod="openshift-service-ca/service-ca-9c57cc56f-z2csl" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.877979 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.887313 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.895461 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nsx27" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.896829 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx2t9\" (UniqueName: \"kubernetes.io/projected/74651634-b893-441a-9e3c-18a8eaeafcfa-kube-api-access-sx2t9\") pod \"csi-hostpathplugin-cpjwx\" (UID: \"74651634-b893-441a-9e3c-18a8eaeafcfa\") " pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.905131 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cw5v4" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.917152 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bdcl\" (UniqueName: \"kubernetes.io/projected/a8d92daa-fa91-4eaa-9699-91459f58d17d-kube-api-access-5bdcl\") pod \"machine-config-server-7khn5\" (UID: \"a8d92daa-fa91-4eaa-9699-91459f58d17d\") " pod="openshift-machine-config-operator/machine-config-server-7khn5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.927098 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562434-wtx87" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.933806 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.934248 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-z2csl" Mar 17 11:15:45 crc kubenswrapper[4742]: E0317 11:15:45.934600 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:46.434561228 +0000 UTC m=+249.560688986 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.953571 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d98zd\" (UniqueName: \"kubernetes.io/projected/b007618f-b075-4f57-85e0-5fa8e89bc1bb-kube-api-access-d98zd\") pod \"ingress-canary-w4g9d\" (UID: \"b007618f-b075-4f57-85e0-5fa8e89bc1bb\") " pod="openshift-ingress-canary/ingress-canary-w4g9d" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.953817 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.958946 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7khn5" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.960583 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz762\" (UniqueName: \"kubernetes.io/projected/ba5a493d-557e-4551-a13f-bf257f49623b-kube-api-access-kz762\") pod \"machine-config-controller-84d6567774-87n9v\" (UID: \"ba5a493d-557e-4551-a13f-bf257f49623b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.970050 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-58scf" Mar 17 11:15:45 crc kubenswrapper[4742]: I0317 11:15:45.977351 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.041771 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:46 crc kubenswrapper[4742]: E0317 11:15:46.042701 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:46.54268722 +0000 UTC m=+249.668814978 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.073869 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5zpx5" event={"ID":"af535295-2114-4275-b62f-3bee0eb830b5","Type":"ContainerStarted","Data":"aad12009d969ece0326781c9de51c3e9a3a8d3e61aa276e29786ab5c71fbf0c3"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.093045 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wbm7l" event={"ID":"b4ce74de-8d87-4ad4-9a30-d96f45ac21b5","Type":"ContainerStarted","Data":"ec26b26936ec53246c3edc770da1905b32408ffbec95ed01dc47d564c7310dc7"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.098215 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v" event={"ID":"df58c683-d42a-46c4-9e5e-9b717ddc7956","Type":"ContainerStarted","Data":"48dc413756e945f59cf62d2c26c23fa3ebe2f342e9e69cabd0cb4bf4baad0d77"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.098245 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v" event={"ID":"df58c683-d42a-46c4-9e5e-9b717ddc7956","Type":"ContainerStarted","Data":"4583bba22f33940cef539159c61158ebe85ec9f93090a9340f89e2198f51f1af"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.111010 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-76tzr" event={"ID":"954d6c46-40a1-4d36-b42f-5ef67aba794a","Type":"ContainerStarted","Data":"8ccb04ab176df5fd613b602aa6a04a54b526287361ede1d806235958d3dc1179"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.111054 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-76tzr" 
event={"ID":"954d6c46-40a1-4d36-b42f-5ef67aba794a","Type":"ContainerStarted","Data":"4392eab407b5206ff8dcd193b76cd2faea2debd7b8717a07a17bab6327e27879"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.113096 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lfdfp" event={"ID":"dcb66d58-3d7a-47db-b3ff-2ede326cbe34","Type":"ContainerStarted","Data":"5aac6a5f0a0ffc2a5d93dc1482f58c6abf35598426d69bd6ee761160b0afb22c"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.113123 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lfdfp" event={"ID":"dcb66d58-3d7a-47db-b3ff-2ede326cbe34","Type":"ContainerStarted","Data":"e9d3aae0219dafebe6c09c6791b8113653e912cf485f713fee521bccffa165fc"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.124138 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" event={"ID":"5ee15c68-88ae-4ca8-b3d5-94266082d7ba","Type":"ContainerStarted","Data":"2a3a9577fe0d9db7d18b3b1da9c2681043887f5c0d1c2372f97e42d62b15ea57"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.124199 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" event={"ID":"5ee15c68-88ae-4ca8-b3d5-94266082d7ba","Type":"ContainerStarted","Data":"8219a95caeb835f4d08cd802c62ee95dbfaae8be8387e93b26b2805f50dcd3c3"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.125309 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.131130 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hwx7f" event={"ID":"2800a131-02e6-49f1-9385-6065c4b4216e","Type":"ContainerStarted","Data":"6a226ee6ec753fa39dec7f5cd6646cba23dbfbb062d5fff1fe7e82b7c28b8423"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.131169 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hwx7f" event={"ID":"2800a131-02e6-49f1-9385-6065c4b4216e","Type":"ContainerStarted","Data":"dfba904be62a607518148ee5e90e40c6b3b22deb2a0b231054cbf92a8c8104d1"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.132610 4742 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-4msdd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.132650 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" podUID="5ee15c68-88ae-4ca8-b3d5-94266082d7ba" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.133365 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9zclv" event={"ID":"e415e748-23a1-4fdd-80ba-38308aaa4926","Type":"ContainerStarted","Data":"3dca38cfb6eccd01f496f545b3dc1162fa801a758a4700f983bc5f12f8c5ea90"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.133391 4742 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9zclv" event={"ID":"e415e748-23a1-4fdd-80ba-38308aaa4926","Type":"ContainerStarted","Data":"83d096a543863a207ccdeaa873bd13d3e78a898162b3382040161bcb74d8a53f"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.135933 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tmn6g" event={"ID":"fa95c069-97da-45bf-ac92-c80160bd8648","Type":"ContainerStarted","Data":"e321da8ea8f0c0ac765e939f865f6193f556ea1becfe805818ec4d15b5b5c177"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.135993 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tmn6g" event={"ID":"fa95c069-97da-45bf-ac92-c80160bd8648","Type":"ContainerStarted","Data":"5e28775ee762aa7ce3ddf5d486a7bb2850eb3a7090d8ed8458e4a1cfb3e6a5c6"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.136534 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-tmn6g" Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.143779 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8pwp5" event={"ID":"2afdd196-9364-4f22-a98b-27f4d8602196","Type":"ContainerStarted","Data":"af2279dcae18b7429758d609919b5dd15bb6aa179484904f3f1768dd47efef00"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.145767 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8pwp5" event={"ID":"2afdd196-9364-4f22-a98b-27f4d8602196","Type":"ContainerStarted","Data":"8340ce7866f7bff3ae6cb68cc2922afac12c1197a2d009d292f9db83e45cf1cd"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.145548 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.144571 4742 patch_prober.go:28] interesting pod/console-operator-58897d9998-tmn6g container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.146091 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tmn6g" podUID="fa95c069-97da-45bf-ac92-c80160bd8648" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" Mar 17 11:15:46 crc kubenswrapper[4742]: E0317 11:15:46.145624 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:46.645609242 +0000 UTC m=+249.771737000 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.146213 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:46 crc kubenswrapper[4742]: E0317 11:15:46.149032 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:46.6490197 +0000 UTC m=+249.775147458 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.160583 4742 generic.go:334] "Generic (PLEG): container finished" podID="50e9e286-63d8-4081-b085-ad6aa123b560" containerID="32ea6dbabeabc42b928ab671c0cbe776eec527428eded9114658e3ef407090c5" exitCode=0 Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.160898 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-52v8r" event={"ID":"50e9e286-63d8-4081-b085-ad6aa123b560","Type":"ContainerDied","Data":"32ea6dbabeabc42b928ab671c0cbe776eec527428eded9114658e3ef407090c5"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.160969 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-52v8r" event={"ID":"50e9e286-63d8-4081-b085-ad6aa123b560","Type":"ContainerStarted","Data":"9c8a2afccde915f5e9eb5eaa329ccede65dae2396b4b22ad965f331283b984a2"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.167396 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v" Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.185052 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.187844 4742 patch_prober.go:28] interesting pod/router-default-5444994796-hwx7f container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.187898 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hwx7f" podUID="2800a131-02e6-49f1-9385-6065c4b4216e" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.188145 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9fkmh" event={"ID":"361582e0-97ed-4927-b83f-642592572dac","Type":"ContainerStarted","Data":"4a3f74ba61f35db3cebd8c0654e062d49f3d7fbdafa9a72d11756202b3341047"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.189383 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4gd2t" event={"ID":"023414d4-9886-49cf-ad52-b876be342763","Type":"ContainerStarted","Data":"df62836ac38fb2fa17c4125d0b8ee704b3c7be047fe4d5299b545db580d04734"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.191552 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" event={"ID":"8cd74653-8f7b-446d-8ded-b8816cf3f46a","Type":"ContainerStarted","Data":"79165805cf914b51eb38f9cf9daea5b25012b2cd082c0a194e9fadd77a894881"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.191575 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" event={"ID":"8cd74653-8f7b-446d-8ded-b8816cf3f46a","Type":"ContainerStarted","Data":"97857a718a2dd4c2fd6ac8a4a67ed45b0fb179142be3131597e7c206b01fa166"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.193763 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" event={"ID":"f33a63f1-688a-46eb-a32f-5259fa969528","Type":"ContainerStarted","Data":"a0cfa0b1e3062ca7ba5b72dc6b75c65c2f3f3b476a1faba51de5910fe5691f5c"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.195329 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" event={"ID":"a63c2414-b309-48e5-95f2-ab1b45577b92","Type":"ContainerStarted","Data":"eabe583b2513ed0fc194df448849bcbb18fa657bab04d241467f0e90f577a7d7"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.196578 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" event={"ID":"497b1f19-025b-4b65-b062-b4a94eec3cfc","Type":"ContainerStarted","Data":"0ed2836e2808da8b27ae0271a65ffa592068c830ec52870952512e1c220b8392"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.196604 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" event={"ID":"497b1f19-025b-4b65-b062-b4a94eec3cfc","Type":"ContainerStarted","Data":"3836086a144949811b4cd5e235fb4e6d99ec743b2b998f2653ee87794f7f3e09"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.197873 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.203154 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-s5z9r" event={"ID":"0de428d9-1755-4c28-8c6e-cbb115aef7c7","Type":"ContainerStarted","Data":"be4c26c7df998f83f7c81a099ffc86ba240d9675ab5816a11917bf1304c16326"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.203182 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-s5z9r" event={"ID":"0de428d9-1755-4c28-8c6e-cbb115aef7c7","Type":"ContainerStarted","Data":"464b107ab496ddb2706198c98f4c8e547fc1e193ba15ceb36dab60e6f76003ad"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.204004 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-s5z9r" Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.206699 4742 patch_prober.go:28] interesting pod/downloads-7954f5f757-s5z9r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.206737 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-s5z9r" podUID="0de428d9-1755-4c28-8c6e-cbb115aef7c7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.206802 4742 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zk827 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.206826 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" podUID="497b1f19-025b-4b65-b062-b4a94eec3cfc" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.237831 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz" event={"ID":"827bcca5-3d25-4f48-bee8-1f012196617b","Type":"ContainerStarted","Data":"cc06568ea5254c70a87996c77980b2283e9b3963a5aa487f1ca556927574629b"} Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.247255 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w4g9d" Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.248242 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:46 crc kubenswrapper[4742]: E0317 11:15:46.248377 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:46.748356738 +0000 UTC m=+249.874484486 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.248761 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:46 crc kubenswrapper[4742]: E0317 11:15:46.249778 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:46.749767539 +0000 UTC m=+249.875895307 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.349986 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:46 crc kubenswrapper[4742]: E0317 11:15:46.350728 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:46.850710764 +0000 UTC m=+249.976838522 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:46 crc kubenswrapper[4742]: W0317 11:15:46.391174 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8d92daa_fa91_4eaa_9699_91459f58d17d.slice/crio-e1bb246772b030dc002b2f1ba264a41288c96820322e5d74a500d575b3a075d9 WatchSource:0}: Error finding container e1bb246772b030dc002b2f1ba264a41288c96820322e5d74a500d575b3a075d9: Status 404 returned error can't find the container with id e1bb246772b030dc002b2f1ba264a41288c96820322e5d74a500d575b3a075d9 Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.412443 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bc2zs"] Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.412838 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zvgjb"] Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.427443 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-spkdx"] Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.452550 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:46 crc kubenswrapper[4742]: E0317 11:15:46.453455 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:46.95343892 +0000 UTC m=+250.079566678 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.467810 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4"] Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.529550 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6"] Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.549310 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qtcq5"] Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.555080 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:46 crc kubenswrapper[4742]: E0317 11:15:46.555892 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:47.055866897 +0000 UTC m=+250.181994655 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:46 crc kubenswrapper[4742]: W0317 11:15:46.556497 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefd3a468_c0d1_4736_8d34_35448326ade8.slice/crio-635f67c1b18a801b5e0de1f41376e6ae1079811c8d098146a11d862c6fce4d9a WatchSource:0}: Error finding container 635f67c1b18a801b5e0de1f41376e6ae1079811c8d098146a11d862c6fce4d9a: Status 404 returned error can't find the container with id 635f67c1b18a801b5e0de1f41376e6ae1079811c8d098146a11d862c6fce4d9a Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.560810 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:46 crc kubenswrapper[4742]: E0317 11:15:46.562705 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-17 11:15:47.062688024 +0000 UTC m=+250.188815782 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.613797 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hldfg"] Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.662401 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:46 crc kubenswrapper[4742]: E0317 11:15:46.663150 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:47.163077153 +0000 UTC m=+250.289204911 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.764426 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:46 crc kubenswrapper[4742]: E0317 11:15:46.764811 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:47.264794019 +0000 UTC m=+250.390921777 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.866179 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:46 crc kubenswrapper[4742]: E0317 11:15:46.867115 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:47.367096323 +0000 UTC m=+250.493224081 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:46 crc kubenswrapper[4742]: W0317 11:15:46.891227 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a8987e2_189e_4b66_9908_4bccb83f07a6.slice/crio-61897acdfb79644aa14f18b8d6ee7ce56db0dd7ed27e546f032e670391ca3f86 WatchSource:0}: Error finding container 61897acdfb79644aa14f18b8d6ee7ce56db0dd7ed27e546f032e670391ca3f86: Status 404 returned error can't find the container with id 61897acdfb79644aa14f18b8d6ee7ce56db0dd7ed27e546f032e670391ca3f86 Mar 17 11:15:46 crc kubenswrapper[4742]: W0317 11:15:46.914121 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod949d92a4_d000_43fe_91b2_c12bc9c86251.slice/crio-19ba42b7eaa5ddb6868fb1095faa11d2e74247734b87ab8eb9421ca68a3b71eb WatchSource:0}: Error finding container 19ba42b7eaa5ddb6868fb1095faa11d2e74247734b87ab8eb9421ca68a3b71eb: Status 404 returned error can't find the container with id 19ba42b7eaa5ddb6868fb1095faa11d2e74247734b87ab8eb9421ca68a3b71eb Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.925446 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-tmn6g" podStartSLOduration=183.925425758 podStartE2EDuration="3m3.925425758s" podCreationTimestamp="2026-03-17 11:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:46.919151507 +0000 UTC m=+250.045279265" watchObservedRunningTime="2026-03-17 11:15:46.925425758 +0000 UTC m=+250.051553516" Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.949028 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5"] Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.968571 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-s5z9r" podStartSLOduration=182.968548433 podStartE2EDuration="3m2.968548433s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:46.946472395 +0000 UTC m=+250.072600153" watchObservedRunningTime="2026-03-17 11:15:46.968548433 +0000 UTC m=+250.094676191" Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.973638 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:46 crc kubenswrapper[4742]: E0317 11:15:46.975130 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:47.475117602 +0000 UTC m=+250.601245360 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:46 crc kubenswrapper[4742]: I0317 11:15:46.995926 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" podStartSLOduration=182.995895072 podStartE2EDuration="3m2.995895072s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:46.992181185 +0000 UTC m=+250.118308943" watchObservedRunningTime="2026-03-17 11:15:46.995895072 +0000 UTC m=+250.122022830" Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.079796 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:47 crc kubenswrapper[4742]: E0317 11:15:47.080004 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:47.57998166 +0000 UTC m=+250.706109408 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.080173 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:47 crc kubenswrapper[4742]: E0317 11:15:47.080579 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:47.580565827 +0000 UTC m=+250.706693585 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.125274 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t2nj8"] Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.155697 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-76tzr" podStartSLOduration=183.155681025 podStartE2EDuration="3m3.155681025s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:47.153630457 +0000 UTC m=+250.279758215" watchObservedRunningTime="2026-03-17 11:15:47.155681025 +0000 UTC m=+250.281808783" Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.186184 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:47 crc kubenswrapper[4742]: E0317 11:15:47.186540 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:47.686525866 +0000 UTC m=+250.812653624 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.197049 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" podStartSLOduration=183.19703425 podStartE2EDuration="3m3.19703425s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:47.195452304 +0000 UTC m=+250.321580062" watchObservedRunningTime="2026-03-17 11:15:47.19703425 +0000 UTC m=+250.323162008" Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.202640 4742 patch_prober.go:28] interesting pod/router-default-5444994796-hwx7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 11:15:47 crc kubenswrapper[4742]: [-]has-synced failed: reason withheld Mar 17 11:15:47 crc kubenswrapper[4742]: [+]process-running ok Mar 17 11:15:47 crc kubenswrapper[4742]: healthz check failed Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.202700 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hwx7f" podUID="2800a131-02e6-49f1-9385-6065c4b4216e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 11:15:47 crc kubenswrapper[4742]: W0317 11:15:47.211721 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c93c286_d743_4512_a19b_baf28a72cd77.slice/crio-cd4e9b0c4e2414de56112ede9583d51b1231f0cf7da720ec484b50ca9f86ee1a WatchSource:0}: Error finding container cd4e9b0c4e2414de56112ede9583d51b1231f0cf7da720ec484b50ca9f86ee1a: Status 404 returned error can't find the container with id cd4e9b0c4e2414de56112ede9583d51b1231f0cf7da720ec484b50ca9f86ee1a Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.238745 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-hwx7f" podStartSLOduration=183.238727704 podStartE2EDuration="3m3.238727704s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:47.235546051 +0000 UTC m=+250.361673819" watchObservedRunningTime="2026-03-17 11:15:47.238727704 +0000 UTC m=+250.364855462" Mar 17 11:15:47 crc kubenswrapper[4742]: W0317 11:15:47.239479 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod957049a3_8921_4ec9_a66c_d0fe15848fad.slice/crio-ecf20f5d897f977b60fe3899456b9529be6128eb0adcef0c78db67382987d441 WatchSource:0}: Error finding container ecf20f5d897f977b60fe3899456b9529be6128eb0adcef0c78db67382987d441: Status 404 returned error can't find the container with id ecf20f5d897f977b60fe3899456b9529be6128eb0adcef0c78db67382987d441 Mar 17 
11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.287580 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:47 crc kubenswrapper[4742]: E0317 11:15:47.288157 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:47.788145661 +0000 UTC m=+250.914273419 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.301127 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v"] Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.308716 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8pwp5" event={"ID":"2afdd196-9364-4f22-a98b-27f4d8602196","Type":"ContainerStarted","Data":"5124d1b3ab9276066488ed1e5465608429fe546cda5633efaec07840d5de0768"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.321782 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-hg7ln"] Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.322246 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hldfg" event={"ID":"949d92a4-d000-43fe-91b2-c12bc9c86251","Type":"ContainerStarted","Data":"19ba42b7eaa5ddb6868fb1095faa11d2e74247734b87ab8eb9421ca68a3b71eb"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.343732 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6" event={"ID":"1c60b7f0-a64f-4968-a678-66b6cd89dc97","Type":"ContainerStarted","Data":"1809ab3881b33f6e87949b4adf56a5220ada48a26ca631561900c1d63f66ad95"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.368982 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4gd2t" event={"ID":"023414d4-9886-49cf-ad52-b876be342763","Type":"ContainerStarted","Data":"9fe41355020c32328665e8fe7fac82e00d78cfe2542ae4b1ce3adac726858a2b"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.377445 4742 generic.go:334] "Generic (PLEG): container finished" podID="df58c683-d42a-46c4-9e5e-9b717ddc7956" containerID="48dc413756e945f59cf62d2c26c23fa3ebe2f342e9e69cabd0cb4bf4baad0d77" exitCode=0 Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.377784 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v" 
event={"ID":"df58c683-d42a-46c4-9e5e-9b717ddc7956","Type":"ContainerDied","Data":"48dc413756e945f59cf62d2c26c23fa3ebe2f342e9e69cabd0cb4bf4baad0d77"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.394342 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:47 crc kubenswrapper[4742]: E0317 11:15:47.394449 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:47.8944342 +0000 UTC m=+251.020561958 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.394699 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:47 crc kubenswrapper[4742]: E0317 11:15:47.395216 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:47.895198422 +0000 UTC m=+251.021326180 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.412800 4742 generic.go:334] "Generic (PLEG): container finished" podID="a63c2414-b309-48e5-95f2-ab1b45577b92" containerID="985930b2aad7601f9ebb29556e04fa829f5299db0eca3e38dcc0f0d7f2e9bc04" exitCode=0 Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.412979 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" event={"ID":"a63c2414-b309-48e5-95f2-ab1b45577b92","Type":"ContainerDied","Data":"985930b2aad7601f9ebb29556e04fa829f5299db0eca3e38dcc0f0d7f2e9bc04"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.424722 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5" event={"ID":"3c93c286-d743-4512-a19b-baf28a72cd77","Type":"ContainerStarted","Data":"cd4e9b0c4e2414de56112ede9583d51b1231f0cf7da720ec484b50ca9f86ee1a"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.438157 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs" event={"ID":"76ed03a0-90ee-4e37-9580-d7136a7fdc5e","Type":"ContainerStarted","Data":"3ae50d49be986e6ec0eb4fca9d24a421b8a20d60d60e808b6af19896ec08ee32"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.438449 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs" event={"ID":"76ed03a0-90ee-4e37-9580-d7136a7fdc5e","Type":"ContainerStarted","Data":"f2294e972bd10623361efb694ec1be566d2a0dfca37867aa8d88cf065626110f"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.478869 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5" event={"ID":"5a8987e2-189e-4b66-9908-4bccb83f07a6","Type":"ContainerStarted","Data":"61897acdfb79644aa14f18b8d6ee7ce56db0dd7ed27e546f032e670391ca3f86"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.492602 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zvgjb" event={"ID":"a9fc09c5-43ef-4abe-8e2f-04221dad03c0","Type":"ContainerStarted","Data":"27396f6d88b23cf84f3c73ba547d6eb7f6b4faa54391203314575e0f4ad950e8"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.494031 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9fkmh" event={"ID":"361582e0-97ed-4927-b83f-642592572dac","Type":"ContainerStarted","Data":"1e01be70dc95c29dfb148180e042cef93b513c9ac23e4c1a5774a74c81b217c8"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.494808 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" event={"ID":"61b81f5a-30d8-4c88-899b-5effb490bdee","Type":"ContainerStarted","Data":"9064259a5caadf070f835f74744e684fe009d596a72c7a92e1c453129c9dcfb7"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.495267 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:47 crc kubenswrapper[4742]: E0317 11:15:47.495877 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:47.995862338 +0000 UTC m=+251.121990096 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.545104 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-lfdfp" podStartSLOduration=183.54508657 podStartE2EDuration="3m3.54508657s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:47.544225615 +0000 UTC m=+250.670353373" watchObservedRunningTime="2026-03-17 11:15:47.54508657 +0000 UTC m=+250.671214328" Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.596285 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:47 crc kubenswrapper[4742]: E0317 11:15:47.596576 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:48.096566215 +0000 UTC m=+251.222693973 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.627317 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz" event={"ID":"827bcca5-3d25-4f48-bee8-1f012196617b","Type":"ContainerStarted","Data":"24950c9a7165f902fd8c8c34728ff68971364ab525478854ec4f35c016250632"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.638596 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5zpx5" podStartSLOduration=184.638580389 podStartE2EDuration="3m4.638580389s" podCreationTimestamp="2026-03-17 11:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:47.637397114 +0000 UTC m=+250.763524882" watchObservedRunningTime="2026-03-17 11:15:47.638580389 +0000 UTC m=+250.764708147" Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.638939 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4" event={"ID":"efd3a468-c0d1-4736-8d34-35448326ade8","Type":"ContainerStarted","Data":"635f67c1b18a801b5e0de1f41376e6ae1079811c8d098146a11d862c6fce4d9a"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.653588 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7khn5" event={"ID":"a8d92daa-fa91-4eaa-9699-91459f58d17d","Type":"ContainerStarted","Data":"a7530bfc104de413b90acbe3b88bbdfc35a3283df02fdd04276f655ebf880449"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.653800 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7khn5" event={"ID":"a8d92daa-fa91-4eaa-9699-91459f58d17d","Type":"ContainerStarted","Data":"e1bb246772b030dc002b2f1ba264a41288c96820322e5d74a500d575b3a075d9"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.670372 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cw5v4"] Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.673385 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" event={"ID":"f33a63f1-688a-46eb-a32f-5259fa969528","Type":"ContainerStarted","Data":"6bd100bf1cb3a9dd55bab92415f41ddfff93a7f78e8b9d3b6325876762e27138"} Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.675082 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.696090 4742 patch_prober.go:28] interesting pod/downloads-7954f5f757-s5z9r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 
11:15:47.696158 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-s5z9r" podUID="0de428d9-1755-4c28-8c6e-cbb115aef7c7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.696861 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.697102 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:47 crc kubenswrapper[4742]: E0317 11:15:47.698854 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:48.198838558 +0000 UTC m=+251.324966316 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.709432 4742 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6n4cr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.709802 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" podUID="f33a63f1-688a-46eb-a32f-5259fa969528" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.712947 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nsx27"] Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.773574 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.787019 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr"] Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.790178 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg"] Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.798753 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:47 crc kubenswrapper[4742]: E0317 11:15:47.799149 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:48.299137675 +0000 UTC m=+251.425265423 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.800025 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cpjwx"] Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.824512 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-58scf"] Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.827437 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z2csl"] Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.843898 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-wdrq6" podStartSLOduration=184.843882307 podStartE2EDuration="3m4.843882307s" podCreationTimestamp="2026-03-17 11:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:47.83464462 +0000 UTC m=+250.960772378" watchObservedRunningTime="2026-03-17 11:15:47.843882307 +0000 UTC m=+250.970010065" Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.887217 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562434-wtx87"] Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.899571 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp"] Mar 17 11:15:47 crc kubenswrapper[4742]: W0317 11:15:47.900027 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74651634_b893_441a_9e3c_18a8eaeafcfa.slice/crio-4b3cea39c7d693e2f04ece0146c3e387a5c9c7d904a1a60002e3663428ebd477 WatchSource:0}: Error finding container 4b3cea39c7d693e2f04ece0146c3e387a5c9c7d904a1a60002e3663428ebd477: Status 404 returned error can't find the container with id 4b3cea39c7d693e2f04ece0146c3e387a5c9c7d904a1a60002e3663428ebd477 Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.906024 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:47 crc kubenswrapper[4742]: E0317 11:15:47.906250 4742 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:48.406227546 +0000 UTC m=+251.532355304 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.916447 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:47 crc kubenswrapper[4742]: E0317 11:15:47.917118 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:48.417103801 +0000 UTC m=+251.543231559 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.975269 4742 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 11:15:47 crc kubenswrapper[4742]: I0317 11:15:47.995681 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4" podStartSLOduration=183.995650678 podStartE2EDuration="3m3.995650678s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:47.9614033 +0000 UTC m=+251.087531058" watchObservedRunningTime="2026-03-17 11:15:47.995650678 +0000 UTC m=+251.121778436" Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.016058 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w4g9d"] Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.016076 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" podStartSLOduration=184.016050768 podStartE2EDuration="3m4.016050768s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:48.005344908 +0000 UTC m=+251.131472666" watchObservedRunningTime="2026-03-17 11:15:48.016050768 
+0000 UTC m=+251.142178526" Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.018539 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:48 crc kubenswrapper[4742]: E0317 11:15:48.018888 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:48.518871799 +0000 UTC m=+251.644999557 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.051230 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.051551 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.052704 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-7khn5" podStartSLOduration=6.052685305 podStartE2EDuration="6.052685305s" podCreationTimestamp="2026-03-17 11:15:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:48.02339797 +0000 UTC m=+251.149525718" watchObservedRunningTime="2026-03-17 11:15:48.052685305 +0000 UTC m=+251.178813063" Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.055794 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v"] Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.079483 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz" podStartSLOduration=185.079451968 podStartE2EDuration="3m5.079451968s" podCreationTimestamp="2026-03-17 11:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:48.079188211 +0000 UTC m=+251.205315969" watchObservedRunningTime="2026-03-17 11:15:48.079451968 +0000 UTC m=+251.205579726" Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.120485 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:48 crc kubenswrapper[4742]: E0317 11:15:48.120925 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:48.620895435 +0000 UTC m=+251.747023193 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:48 crc kubenswrapper[4742]: W0317 11:15:48.129017 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb007618f_b075_4f57_85e0_5fa8e89bc1bb.slice/crio-6c76b9afd2e54c4cc3f053b867bbcd9e2cde9d508845568c29b4caaa741e441e WatchSource:0}: Error finding container 6c76b9afd2e54c4cc3f053b867bbcd9e2cde9d508845568c29b4caaa741e441e: Status 404 returned error can't find the container with id 6c76b9afd2e54c4cc3f053b867bbcd9e2cde9d508845568c29b4caaa741e441e Mar 17 11:15:48 crc kubenswrapper[4742]: W0317 11:15:48.156461 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba5a493d_557e_4551_a13f_bf257f49623b.slice/crio-4641408569ff9cb5e5983cd6f91a95188be51f7968448f35e816db817b4d17bf WatchSource:0}: Error finding container 4641408569ff9cb5e5983cd6f91a95188be51f7968448f35e816db817b4d17bf: Status 404 returned error can't find the container with id 4641408569ff9cb5e5983cd6f91a95188be51f7968448f35e816db817b4d17bf Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.197096 4742 patch_prober.go:28] interesting pod/router-default-5444994796-hwx7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 11:15:48 crc kubenswrapper[4742]: [-]has-synced failed: reason withheld Mar 17 11:15:48 crc kubenswrapper[4742]: [+]process-running ok Mar 17 11:15:48 crc kubenswrapper[4742]: healthz check failed Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.197139 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hwx7f" podUID="2800a131-02e6-49f1-9385-6065c4b4216e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.222725 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:48 crc kubenswrapper[4742]: E0317 11:15:48.223057 4742 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:48.723042675 +0000 UTC m=+251.849170433 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.236243 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4gd2t" podStartSLOduration=184.236226675 podStartE2EDuration="3m4.236226675s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:48.177564131 +0000 UTC m=+251.303691889" watchObservedRunningTime="2026-03-17 11:15:48.236226675 +0000 UTC m=+251.362354423" Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.245775 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.319753 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8pwp5" podStartSLOduration=185.319735496 podStartE2EDuration="3m5.319735496s" podCreationTimestamp="2026-03-17 11:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:48.318946973 +0000 UTC m=+251.445074731" watchObservedRunningTime="2026-03-17 11:15:48.319735496 +0000 UTC m=+251.445863254" Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.326124 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:48 crc kubenswrapper[4742]: E0317 11:15:48.326433 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:48.82642108 +0000 UTC m=+251.952548838 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.428233 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:48 crc kubenswrapper[4742]: E0317 11:15:48.428636 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:48.92861892 +0000 UTC m=+252.054746678 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.530086 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:48 crc kubenswrapper[4742]: E0317 11:15:48.530582 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:49.030570683 +0000 UTC m=+252.156698441 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.632695 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:48 crc kubenswrapper[4742]: E0317 11:15:48.632880 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:49.132850666 +0000 UTC m=+252.258978424 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.633400 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:48 crc kubenswrapper[4742]: E0317 11:15:48.633833 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:49.133820375 +0000 UTC m=+252.259948133 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.675677 4742 patch_prober.go:28] interesting pod/console-operator-58897d9998-tmn6g container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.675724 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tmn6g" podUID="fa95c069-97da-45bf-ac92-c80160bd8648" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.705792 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w4g9d" event={"ID":"b007618f-b075-4f57-85e0-5fa8e89bc1bb","Type":"ContainerStarted","Data":"6c76b9afd2e54c4cc3f053b867bbcd9e2cde9d508845568c29b4caaa741e441e"} Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.715733 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg" event={"ID":"aec28323-bc80-4975-a4c8-c3bd9a05c356","Type":"ContainerStarted","Data":"5b7e9b61caf478ef7e1bd97307c775d45e496b7d9d827e360d86b9b9e6dc2c94"} Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.715775 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg" event={"ID":"aec28323-bc80-4975-a4c8-c3bd9a05c356","Type":"ContainerStarted","Data":"b7c3f23745bc1fdec016dec1f2ba289990640b0212902ddc71672e0cad7f9089"} Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.716680 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg" Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.723845 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zvgjb" event={"ID":"a9fc09c5-43ef-4abe-8e2f-04221dad03c0","Type":"ContainerStarted","Data":"88dfa8dfd7f743578b1f869e5571a8ae10d9bf77d3cf8416a3aaac3b008332c7"} Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.733425 4742 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-gm8lg container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" start-of-body= Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.733513 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg" podUID="aec28323-bc80-4975-a4c8-c3bd9a05c356" containerName="packageserver" probeResult="failure" output="Get 
\"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.735809 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:48 crc kubenswrapper[4742]: E0317 11:15:48.736360 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:49.236346175 +0000 UTC m=+252.362473933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.743489 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" event={"ID":"74651634-b893-441a-9e3c-18a8eaeafcfa","Type":"ContainerStarted","Data":"4b3cea39c7d693e2f04ece0146c3e387a5c9c7d904a1a60002e3663428ebd477"} Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.766061 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v" event={"ID":"ba5a493d-557e-4551-a13f-bf257f49623b","Type":"ContainerStarted","Data":"4641408569ff9cb5e5983cd6f91a95188be51f7968448f35e816db817b4d17bf"} Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.811366 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-skgwz" event={"ID":"827bcca5-3d25-4f48-bee8-1f012196617b","Type":"ContainerStarted","Data":"d3dfc59abf2365bc2189287310d7f337b459805cb5375271fcd67e3402f87975"} Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.837576 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:48 crc kubenswrapper[4742]: E0317 11:15:48.839456 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:49.339441502 +0000 UTC m=+252.465569260 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.841016 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-52v8r" event={"ID":"50e9e286-63d8-4081-b085-ad6aa123b560","Type":"ContainerStarted","Data":"c3f7f5197cfac12361b15639c373f72a73a505a2c4aaf3de635d34e699b84dd6"} Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.841052 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-52v8r" event={"ID":"50e9e286-63d8-4081-b085-ad6aa123b560","Type":"ContainerStarted","Data":"456aa1b4f23599a5bb9310dd37e63efa1c6b95a03d0cceb8e50d4b0a2550bf39"} Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.871430 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-58scf" event={"ID":"08eb3fa4-f651-4ba7-b367-1cc1e684398c","Type":"ContainerStarted","Data":"4ffdc892e1b3bffcd911e3ab8e6cebc12899694d2bd7061088f808ddb3ffd099"} Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.872838 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-z2csl" event={"ID":"8a19595a-7833-40fa-a836-f87d3a294f86","Type":"ContainerStarted","Data":"44a58d39808fccf24153e37ea8a43e445558806ae25c225e3891928b33b01a63"} Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.874445 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hg7ln" event={"ID":"d5eebf39-d75b-460d-9d32-de0eca5b904d","Type":"ContainerStarted","Data":"df0a6e9204f65c0331fc0cb06f13db5eb86f5b7d555044b0402d97b04ce54a54"} Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.874469 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hg7ln" event={"ID":"d5eebf39-d75b-460d-9d32-de0eca5b904d","Type":"ContainerStarted","Data":"da793fabad8da037c3a960e3304618d4403728e24f6f352083c8d0146ffcc48d"} Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.942168 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hldfg" event={"ID":"949d92a4-d000-43fe-91b2-c12bc9c86251","Type":"ContainerStarted","Data":"a549374696c0509ded6323898fc25b7133b1992dfd08775d63582a70965e82ae"} Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.942579 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:48 crc kubenswrapper[4742]: E0317 11:15:48.942807 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-17 11:15:49.442786276 +0000 UTC m=+252.568914034 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.942965 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:48 crc kubenswrapper[4742]: E0317 11:15:48.944201 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:49.444186526 +0000 UTC m=+252.570314284 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.984979 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v" event={"ID":"df58c683-d42a-46c4-9e5e-9b717ddc7956","Type":"ContainerStarted","Data":"679f637414d53fd9be6049421eaf4fcdc9978f695f8a45479c902613a6bccceb"} Mar 17 11:15:48 crc kubenswrapper[4742]: I0317 11:15:48.985067 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v" Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.018345 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v" event={"ID":"382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f","Type":"ContainerStarted","Data":"0b68bba86288530b43908ca6af48119090dfca5430cdcfd226a3ee39a745845c"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.018385 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v" event={"ID":"382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f","Type":"ContainerStarted","Data":"454a1ff6bccda93f115c3d5475351bf7a8d6526697a44a1850a0adc5d03f115e"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.046313 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:49 crc kubenswrapper[4742]: E0317 11:15:49.048104 4742 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:49.548074896 +0000 UTC m=+252.674202654 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.065662 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9fkmh" event={"ID":"361582e0-97ed-4927-b83f-642592572dac","Type":"ContainerStarted","Data":"94b64196606953b0b326da1190e28acf318ec5deea11f215c2164bb9e390a049"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.074037 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6" event={"ID":"1c60b7f0-a64f-4968-a678-66b6cd89dc97","Type":"ContainerStarted","Data":"48563c7335ab41a2654a65fff0e1df8738425aa2d7c508fd4bf7f06bf91eea14"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.074074 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6" event={"ID":"1c60b7f0-a64f-4968-a678-66b6cd89dc97","Type":"ContainerStarted","Data":"0e5d76a0d3fdc9cd4594df61a2f27a78bf729a636ddadc8944e74be1053cb085"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.095821 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t2nj8" event={"ID":"957049a3-8921-4ec9-a66c-d0fe15848fad","Type":"ContainerStarted","Data":"321e212b31df22df5f533216e5e9b5b7b2ec519e4a4d9011baaaacb11286393f"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.096463 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t2nj8" event={"ID":"957049a3-8921-4ec9-a66c-d0fe15848fad","Type":"ContainerStarted","Data":"ecf20f5d897f977b60fe3899456b9529be6128eb0adcef0c78db67382987d441"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.123397 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" event={"ID":"75629956-e407-4638-90cd-fd2f907bb0fb","Type":"ContainerStarted","Data":"300d665f9fa4f8318b1145926cca19ef60266854c5ffcc0a3bc845995ee1c214"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.123438 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" event={"ID":"75629956-e407-4638-90cd-fd2f907bb0fb","Type":"ContainerStarted","Data":"393835b451ec6f783a5019e97a9bb84d3b467277b57706160731fa596e7c0507"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.148019 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nsx27" event={"ID":"df1d9cfc-5349-44ec-bed8-ac71aa7741d1","Type":"ContainerStarted","Data":"e33700dc4bd7057d3f3e231328511f5b950549fef39e4c2fac4b673baaf018e3"} Mar 17 11:15:49 crc 
kubenswrapper[4742]: I0317 11:15:49.148952 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:49 crc kubenswrapper[4742]: E0317 11:15:49.150864 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:49.650852963 +0000 UTC m=+252.776980721 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.164918 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5" event={"ID":"5a8987e2-189e-4b66-9908-4bccb83f07a6","Type":"ContainerStarted","Data":"fa5c7d846c36fe8a42d95e83d7386685c15f0b14bbcea9d9611a80a8e4d55650"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.193891 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9zclv" event={"ID":"e415e748-23a1-4fdd-80ba-38308aaa4926","Type":"ContainerStarted","Data":"0d8a3db151404cb6f65b41da4f04f8bfb04ba63c9a05bfd20095b9dfd79a0ef4"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.198718 4742 patch_prober.go:28] interesting pod/router-default-5444994796-hwx7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 11:15:49 crc kubenswrapper[4742]: [-]has-synced failed: reason withheld Mar 17 11:15:49 crc kubenswrapper[4742]: [+]process-running ok Mar 17 11:15:49 crc kubenswrapper[4742]: healthz check failed Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.198768 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hwx7f" podUID="2800a131-02e6-49f1-9385-6065c4b4216e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.216356 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cw5v4" event={"ID":"a7bb0b6b-532a-4492-9fa3-c24db5074886","Type":"ContainerStarted","Data":"0d9954c31a6b0d4af7d8d9b23527c3e5c3be9228fa0d60e3dcb0b35f9a074057"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.227867 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xxd4" event={"ID":"efd3a468-c0d1-4736-8d34-35448326ade8","Type":"ContainerStarted","Data":"8fc5ea1d83b6433c731e6f1346b130138c2d2a3548a6c68c1d38401671548f7a"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.236385 4742 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wbm7l" event={"ID":"b4ce74de-8d87-4ad4-9a30-d96f45ac21b5","Type":"ContainerStarted","Data":"a274e9bcb14adb8d8ce1d64cb886a742004706cbf335be852def2736853ae1da"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.250644 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" event={"ID":"61b81f5a-30d8-4c88-899b-5effb490bdee","Type":"ContainerStarted","Data":"394aa3c16e64ab8ab054db1eea6d6fdde6e1172c949c1fcb9ef49e6d47f51abd"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.251708 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.252211 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:49 crc kubenswrapper[4742]: E0317 11:15:49.253538 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:49.753523098 +0000 UTC m=+252.879650856 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.263087 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5" event={"ID":"3c93c286-d743-4512-a19b-baf28a72cd77","Type":"ContainerStarted","Data":"c6ae0d7745c8fce23192b7c054b17e9e853d0b18998ff1dfbc716acad1b3875d"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.264023 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5" Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.268484 4742 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-2ftt5 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.268521 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5" podUID="3c93c286-d743-4512-a19b-baf28a72cd77" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.294476 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp" 
event={"ID":"45e5ba25-cee1-4d34-8fff-a006d7cbbd5c","Type":"ContainerStarted","Data":"7c36e89c77ff3aafd10b068aca3a4c544d267fa9fbf209a9798a1a94d8abe1a8"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.294518 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp" event={"ID":"45e5ba25-cee1-4d34-8fff-a006d7cbbd5c","Type":"ContainerStarted","Data":"86a77b5097ed98de8a80d57a1eb16ed383b0a4bdb650b8441d8c6acecb58a431"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.295283 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp" Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.308609 4742 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-kq8zp container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.308972 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp" podUID="45e5ba25-cee1-4d34-8fff-a006d7cbbd5c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.312492 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562434-wtx87" event={"ID":"5b3a8612-a5db-4ec8-9873-32829e2fe69e","Type":"ContainerStarted","Data":"f3af0eb7fcb7ccb9197f7f8bc761f0c9a1016569c7cbb5432dab238ad6daf9e7"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.344876 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs" event={"ID":"76ed03a0-90ee-4e37-9580-d7136a7fdc5e","Type":"ContainerStarted","Data":"57c55a392dcfb6b63f0e2db739d69100cd8392405788e372ea5571545699591e"} Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.346295 4742 patch_prober.go:28] interesting pod/downloads-7954f5f757-s5z9r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.346394 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-s5z9r" podUID="0de428d9-1755-4c28-8c6e-cbb115aef7c7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.346672 4742 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6n4cr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.346774 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" podUID="f33a63f1-688a-46eb-a32f-5259fa969528" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection 
refused" Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.353938 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:49 crc kubenswrapper[4742]: E0317 11:15:49.355357 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:49.855345037 +0000 UTC m=+252.981472795 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.380941 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-tmn6g" Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.458340 4742 ???:1] "http: TLS handshake error from 192.168.126.11:37690: no serving certificate available for the kubelet" Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.458550 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:49 crc kubenswrapper[4742]: E0317 11:15:49.459933 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:49.959914067 +0000 UTC m=+253.086041825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.461532 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:49 crc kubenswrapper[4742]: E0317 11:15:49.466789 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:49.966775985 +0000 UTC m=+253.092903743 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.539539 4742 ???:1] "http: TLS handshake error from 192.168.126.11:37704: no serving certificate available for the kubelet" Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.562313 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:49 crc kubenswrapper[4742]: E0317 11:15:49.562450 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:50.062428047 +0000 UTC m=+253.188555805 (durationBeforeRetry 500ms). 
Mar 17 11:15:49 crc kubenswrapper[4742]: E0317 11:15:49.562450 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:50.062428047 +0000 UTC m=+253.188555805 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.562574 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:49 crc kubenswrapper[4742]: E0317 11:15:49.562868 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:50.06286073 +0000 UTC m=+253.188988488 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.626750 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nsx27" podStartSLOduration=185.626733653 podStartE2EDuration="3m5.626733653s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:49.580517779 +0000 UTC m=+252.706645547" watchObservedRunningTime="2026-03-17 11:15:49.626733653 +0000 UTC m=+252.752861411"
Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.629062 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v" podStartSLOduration=186.629054641 podStartE2EDuration="3m6.629054641s" podCreationTimestamp="2026-03-17 11:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:49.615682184 +0000 UTC m=+252.741809952" watchObservedRunningTime="2026-03-17 11:15:49.629054641 +0000 UTC m=+252.755182399"
Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.650346 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-2f4j6" podStartSLOduration=185.650332685 podStartE2EDuration="3m5.650332685s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:49.648158413 +0000 UTC m=+252.774286171" watchObservedRunningTime="2026-03-17 11:15:49.650332685 +0000 UTC m=+252.776460443"
Mar 17 11:15:49 crc kubenswrapper[4742]:
I0317 11:15:49.665492 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:49 crc kubenswrapper[4742]: E0317 11:15:49.665882 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:50.165866693 +0000 UTC m=+253.291994451 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.688247 4742 ???:1] "http: TLS handshake error from 192.168.126.11:37706: no serving certificate available for the kubelet" Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.688579 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg" podStartSLOduration=185.688566899 podStartE2EDuration="3m5.688566899s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:49.688430235 +0000 UTC m=+252.814557993" watchObservedRunningTime="2026-03-17 11:15:49.688566899 +0000 UTC m=+252.814694657" Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.739876 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-9fkmh" podStartSLOduration=185.73986284 podStartE2EDuration="3m5.73986284s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:49.739074188 +0000 UTC m=+252.865201956" watchObservedRunningTime="2026-03-17 11:15:49.73986284 +0000 UTC m=+252.865990598" Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.768168 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:49 crc kubenswrapper[4742]: E0317 11:15:49.768516 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:50.268500167 +0000 UTC m=+253.394627925 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.778538 4742 ???:1] "http: TLS handshake error from 192.168.126.11:37716: no serving certificate available for the kubelet"
Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.804082 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5" podStartSLOduration=185.804064624 podStartE2EDuration="3m5.804064624s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:49.802311554 +0000 UTC m=+252.928439312" watchObservedRunningTime="2026-03-17 11:15:49.804064624 +0000 UTC m=+252.930192382"
Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.812978 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-52v8r"
Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.813226 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-52v8r"
Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.821663 4742 patch_prober.go:28] interesting pod/apiserver-76f77b778f-52v8r container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.821713 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-52v8r" podUID="50e9e286-63d8-4081-b085-ad6aa123b560" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused"
Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.840209 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zk827"]
Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.868670 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
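
Note: the recurring "no serving certificate available for the kubelet" handshake errors are the kubelet refusing scrapes of its HTTPS endpoint (port 10250 by default) because its serving-certificate CSR has not been signed yet; the connections from 192.168.126.11 keep failing until the certificate lands. A request that is still unsigned shows up as a CertificateSigningRequest with an empty status.certificate. A short client-go sketch under the same assumptions as the sketch above (the kubeconfig path is illustrative; kubernetes.io/kubelet-serving is the standard signer name for kubelet serving certificates):

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config") // path is an assumption
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)

        // A kubelet serving CSR that has not been signed yet carries no
        // certificate in its status; while one is pending, every TLS
        // handshake with the kubelet fails exactly as logged above.
        csrs, err := cs.CertificatesV1().CertificateSigningRequests().List(context.TODO(), metav1.ListOptions{})
        if err != nil {
            panic(err)
        }
        for _, csr := range csrs.Items {
            if csr.Spec.SignerName == "kubernetes.io/kubelet-serving" && len(csr.Status.Certificate) == 0 {
                fmt.Println("pending kubelet serving CSR:", csr.Name)
            }
        }
    }
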
Mar 17 11:15:49 crc kubenswrapper[4742]: E0317 11:15:49.869011 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:50.368998198 +0000 UTC m=+253.495125956 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.885468 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-52v8r" podStartSLOduration=186.885451114 podStartE2EDuration="3m6.885451114s" podCreationTimestamp="2026-03-17 11:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:49.883633762 +0000 UTC m=+253.009761510" watchObservedRunningTime="2026-03-17 11:15:49.885451114 +0000 UTC m=+253.011578872"
Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.935295 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd"]
Mar 17 11:15:49 crc kubenswrapper[4742]: I0317 11:15:49.969515 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:49 crc kubenswrapper[4742]: E0317 11:15:49.969796 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:50.469785998 +0000 UTC m=+253.595913756 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.003279 4742 ???:1] "http: TLS handshake error from 192.168.126.11:37726: no serving certificate available for the kubelet"
Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.070926 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 17 11:15:50 crc kubenswrapper[4742]: E0317 11:15:50.071333 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:50.57129768 +0000 UTC m=+253.697425438 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.071951 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:50 crc kubenswrapper[4742]: E0317 11:15:50.072468 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:50.572449373 +0000 UTC m=+253.698577131 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.173456 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:50 crc kubenswrapper[4742]: E0317 11:15:50.173933 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:50.673893672 +0000 UTC m=+253.800021430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.187213 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" podStartSLOduration=187.187196457 podStartE2EDuration="3m7.187196457s" podCreationTimestamp="2026-03-17 11:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:50.093346757 +0000 UTC m=+253.219474515" watchObservedRunningTime="2026-03-17 11:15:50.187196457 +0000 UTC m=+253.313324215" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.187881 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9zclv" podStartSLOduration=186.187876436 podStartE2EDuration="3m6.187876436s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:50.185417505 +0000 UTC m=+253.311545273" watchObservedRunningTime="2026-03-17 11:15:50.187876436 +0000 UTC m=+253.314004194" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.198265 4742 patch_prober.go:28] interesting pod/router-default-5444994796-hwx7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 11:15:50 crc kubenswrapper[4742]: [-]has-synced failed: reason withheld Mar 17 11:15:50 crc kubenswrapper[4742]: [+]process-running ok Mar 17 11:15:50 crc kubenswrapper[4742]: healthz check failed Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.198371 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hwx7f" podUID="2800a131-02e6-49f1-9385-6065c4b4216e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.248073 4742 ???:1] "http: TLS handshake error from 192.168.126.11:37740: no serving certificate available for the kubelet" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.251402 4742 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-spkdx container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.251452 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" podUID="61b81f5a-30d8-4c88-899b-5effb490bdee" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.275082 4742 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:50 crc kubenswrapper[4742]: E0317 11:15:50.275438 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:50.775425274 +0000 UTC m=+253.901553032 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.376175 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
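
Note: each failed attempt above is parked by the kubelet's pending-operations table rather than retried in a tight loop: the error is recorded together with an earliest-retry timestamp ("No retries permitted until ..."), and the reconciler skips the volume until that moment has passed; in this log the window stays at 500ms per attempt. A toy version of that gate, only to illustrate the bookkeeping (not the kubelet's actual nestedpendingoperations implementation):

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // pendingOp is a toy model of the gate behind "No retries permitted
    // until ...": a failed operation records an earliest-retry time, and
    // callers simply skip it until that time has passed.
    type pendingOp struct {
        lastError  error
        retryAfter time.Time
    }

    func (p *pendingOp) fail(err error, backoff time.Duration) {
        p.lastError = err
        p.retryAfter = time.Now().Add(backoff) // logged as the retry deadline
    }

    func (p *pendingOp) mayRetry() bool { return time.Now().After(p.retryAfter) }

    func main() {
        op := &pendingOp{}
        op.fail(errors.New("driver name kubevirt.io.hostpath-provisioner not found"), 500*time.Millisecond)
        fmt.Println("may retry now?", op.mayRetry()) // false
        time.Sleep(600 * time.Millisecond)
        fmt.Println("may retry now?", op.mayRetry()) // true
    }
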
Mar 17 11:15:50 crc kubenswrapper[4742]: E0317 11:15:50.376781 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:50.8767644 +0000 UTC m=+254.002892148 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.377983 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp" podStartSLOduration=186.377965514 podStartE2EDuration="3m6.377965514s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:50.298171951 +0000 UTC m=+253.424299709" watchObservedRunningTime="2026-03-17 11:15:50.377965514 +0000 UTC m=+253.504093262"
Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.393111 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-bc2zs" podStartSLOduration=186.393084861 podStartE2EDuration="3m6.393084861s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:50.376974786 +0000 UTC m=+253.503102544" watchObservedRunningTime="2026-03-17 11:15:50.393084861 +0000 UTC m=+253.519212619"
Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.421720 4742 ???:1] "http: TLS handshake error from 192.168.126.11:37746: no serving certificate available for the kubelet"
Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.434406 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v" event={"ID":"ba5a493d-557e-4551-a13f-bf257f49623b","Type":"ContainerStarted","Data":"cb4cbaefd618213551d8e4ea562766b4ec811555a57a8ebd52b0df2c08001b9c"}
Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.434453 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v" event={"ID":"ba5a493d-557e-4551-a13f-bf257f49623b","Type":"ContainerStarted","Data":"44ea75539e7e7b24a431229e36899df06c200cefdb55e31fe6badfdb6d509047"}
Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.449041 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-58scf" event={"ID":"08eb3fa4-f651-4ba7-b367-1cc1e684398c","Type":"ContainerStarted","Data":"fa6ee9a0aa197ce0a069861d8fc0740753cf938bcff2582a322ba9a984d4ebdb"}
Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.449085 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-58scf" event={"ID":"08eb3fa4-f651-4ba7-b367-1cc1e684398c","Type":"ContainerStarted","Data":"d09cf31272af49a4cc7a5e30589a3376fd5f6fb89df21f530d594d9000a347f6"}
Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.449608 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-58scf"
Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.467898 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-z2csl"
event={"ID":"8a19595a-7833-40fa-a836-f87d3a294f86","Type":"ContainerStarted","Data":"df22f8861a8a72a7dec9b7a7ca269b56465bd0cc1de9b235907dfa6f95b731e8"} Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.478254 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:50 crc kubenswrapper[4742]: E0317 11:15:50.478690 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:50.978671862 +0000 UTC m=+254.104799620 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.504814 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v" event={"ID":"382cdae8-2d91-4edf-9ac7-0eeb4fd2b88f","Type":"ContainerStarted","Data":"1d1d3227dd9b2d6a1cd92e47a1517cf27f5f5382a07d6c7e3e2dcfdbae65d80d"} Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.507346 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-qtcq5" podStartSLOduration=186.507336889 podStartE2EDuration="3m6.507336889s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:50.46576671 +0000 UTC m=+253.591894468" watchObservedRunningTime="2026-03-17 11:15:50.507336889 +0000 UTC m=+253.633464647" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.508335 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-z2csl" podStartSLOduration=186.508330368 podStartE2EDuration="3m6.508330368s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:50.504557159 +0000 UTC m=+253.630684917" watchObservedRunningTime="2026-03-17 11:15:50.508330368 +0000 UTC m=+253.634458116" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.556330 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hg7ln" event={"ID":"d5eebf39-d75b-460d-9d32-de0eca5b904d","Type":"ContainerStarted","Data":"2f6ead7891440fa1799e894e1f9caa1a3795b890f8c30eae68e08f09c74311eb"} Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.561178 4742 ???:1] "http: TLS handshake error from 192.168.126.11:37752: no serving certificate available for the kubelet" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.576404 4742 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nsx27" event={"ID":"df1d9cfc-5349-44ec-bed8-ac71aa7741d1","Type":"ContainerStarted","Data":"28aa625db26297e53c3e450d5d3f12cf17172b977cf8a0bae9b45c389f59f131"} Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.580186 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:50 crc kubenswrapper[4742]: E0317 11:15:50.581260 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:51.081243464 +0000 UTC m=+254.207371222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.593201 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" event={"ID":"74651634-b893-441a-9e3c-18a8eaeafcfa","Type":"ContainerStarted","Data":"d93e3e8fe45537db5a6a9975ac08b83ce5e740342beba91949862a66bcc82347"} Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.607359 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cw5v4" event={"ID":"a7bb0b6b-532a-4492-9fa3-c24db5074886","Type":"ContainerStarted","Data":"5ad2eade56b47cdc5635bd37634ffdf0e7702759970de50a4aabf8bd2bc349f9"} Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.607424 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cw5v4" event={"ID":"a7bb0b6b-532a-4492-9fa3-c24db5074886","Type":"ContainerStarted","Data":"bdbf495887a7d00b915741e006533db63c8b31ba11a94b4877b6355ce213ef03"} Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.608290 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cw5v4" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.628379 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" event={"ID":"a63c2414-b309-48e5-95f2-ab1b45577b92","Type":"ContainerStarted","Data":"737d9fc938ffb660764e6430a7883e9de7bca2501cd73cedca9d3d98d1672ed4"} Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.658811 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w4g9d" event={"ID":"b007618f-b075-4f57-85e0-5fa8e89bc1bb","Type":"ContainerStarted","Data":"4b339677ae1e7d6e9cf32cd78fc876684d5d96dc1cf42a1ae2b7cdfa481e316f"} Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.662314 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" podStartSLOduration=50.662305034 podStartE2EDuration="50.662305034s" podCreationTimestamp="2026-03-17 11:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:50.661282444 +0000 UTC m=+253.787410202" watchObservedRunningTime="2026-03-17 11:15:50.662305034 +0000 UTC m=+253.788432792" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.663480 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-t2nj8" podStartSLOduration=186.663476568 podStartE2EDuration="3m6.663476568s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:50.572559113 +0000 UTC m=+253.698686881" watchObservedRunningTime="2026-03-17 11:15:50.663476568 +0000 UTC m=+253.789604326" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.682005 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:50 crc kubenswrapper[4742]: E0317 11:15:50.699175 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:51.199160068 +0000 UTC m=+254.325287816 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.703821 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-zvgjb" podStartSLOduration=186.703800003 podStartE2EDuration="3m6.703800003s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:50.698777187 +0000 UTC m=+253.824904945" watchObservedRunningTime="2026-03-17 11:15:50.703800003 +0000 UTC m=+253.829927761" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.708645 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kq8zp" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.708692 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2ftt5" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.708716 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8tf9v" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.730960 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wbm7l" podStartSLOduration=186.730945376 podStartE2EDuration="3m6.730945376s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:50.729264338 +0000 UTC m=+253.855392096" watchObservedRunningTime="2026-03-17 11:15:50.730945376 +0000 UTC m=+253.857073144" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.782233 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hldfg" podStartSLOduration=186.782218437 podStartE2EDuration="3m6.782218437s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:50.781372642 +0000 UTC m=+253.907500400" watchObservedRunningTime="2026-03-17 11:15:50.782218437 +0000 UTC m=+253.908346195" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.783897 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:50 crc kubenswrapper[4742]: E0317 11:15:50.803796 4742 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:51.303748008 +0000 UTC m=+254.429875966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.819244 4742 ???:1] "http: TLS handshake error from 192.168.126.11:37762: no serving certificate available for the kubelet" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.866219 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzn7v" podStartSLOduration=186.866200712 podStartE2EDuration="3m6.866200712s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:50.829839292 +0000 UTC m=+253.955967060" watchObservedRunningTime="2026-03-17 11:15:50.866200712 +0000 UTC m=+253.992328470" Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.893654 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:50 crc kubenswrapper[4742]: E0317 11:15:50.894430 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:51.394419166 +0000 UTC m=+254.520546924 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.916925 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-87n9v" podStartSLOduration=186.916892015 podStartE2EDuration="3m6.916892015s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:50.901545852 +0000 UTC m=+254.027673620" watchObservedRunningTime="2026-03-17 11:15:50.916892015 +0000 UTC m=+254.043019773"
Mar 17 11:15:50 crc kubenswrapper[4742]: I0317 11:15:50.995027 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-hg7ln" podStartSLOduration=186.99499219 podStartE2EDuration="3m6.99499219s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:50.960176445 +0000 UTC m=+254.086304203" watchObservedRunningTime="2026-03-17 11:15:50.99499219 +0000 UTC m=+254.121119948"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:50.997066 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
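
Note: the pod_startup_latency_tracker entries are internally consistent and easy to verify: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and because none of these pods pulled an image (firstStartedPulling/lastFinishedPulling are the Go zero time 0001-01-01), the SLO duration equals the end-to-end duration. Checking the machine-config-controller entry above in Go:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Values copied from the machine-config-controller-84d6567774-87n9v entry.
        created, _ := time.Parse(time.RFC3339, "2026-03-17T11:12:44Z")
        running, _ := time.Parse(time.RFC3339Nano, "2026-03-17T11:15:50.916892015Z")
        fmt.Println(running.Sub(created)) // 3m6.916892015s, matching podStartE2EDuration
    }

The near-identical 3m5s to 3m7s startups shared by almost every operator pod (all created around 11:12:44, all first observed running after 11:15:49) hint at one shared stall earlier in startup rather than per-pod problems.
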
Mar 17 11:15:51 crc kubenswrapper[4742]: E0317 11:15:50.997522 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:51.497503022 +0000 UTC m=+254.623630780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.060453 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" podStartSLOduration=187.060434759 podStartE2EDuration="3m7.060434759s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:51.055675602 +0000 UTC m=+254.181803360" watchObservedRunningTime="2026-03-17 11:15:51.060434759 +0000 UTC m=+254.186562517"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.099184 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:51 crc kubenswrapper[4742]: E0317 11:15:51.099819 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:51.599805306 +0000 UTC m=+254.725933064 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.120004 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-58scf" podStartSLOduration=9.112901185 podStartE2EDuration="9.112901185s" podCreationTimestamp="2026-03-17 11:15:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:51.09159195 +0000 UTC m=+254.217719718" watchObservedRunningTime="2026-03-17 11:15:51.112901185 +0000 UTC m=+254.239050173"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.166452 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cnzgk"]
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.167490 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cnzgk"]
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.167579 4742 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-cnzgk"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.187185 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-w4g9d" podStartSLOduration=9.187170709 podStartE2EDuration="9.187170709s" podCreationTimestamp="2026-03-17 11:15:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:51.186815379 +0000 UTC m=+254.312943137" watchObservedRunningTime="2026-03-17 11:15:51.187170709 +0000 UTC m=+254.313298467"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.197952 4742 patch_prober.go:28] interesting pod/router-default-5444994796-hwx7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 17 11:15:51 crc kubenswrapper[4742]: [-]has-synced failed: reason withheld
Mar 17 11:15:51 crc kubenswrapper[4742]: [+]process-running ok
Mar 17 11:15:51 crc kubenswrapper[4742]: healthz check failed
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.198008 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hwx7f" podUID="2800a131-02e6-49f1-9385-6065c4b4216e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.202593 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.203485 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
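
Note: the router's startup probe failure carries its own diagnosis in the captured body: of the healthz checks, [-]backend-http and [-]has-synced are still failing while [+]process-running passes, so the endpoint returns 500 and the kubelet keeps recording the probe as failed until every check goes green. Functionally, a kubelet HTTP probe is just a GET with a short timeout in which only 2xx/3xx status codes count as success; a stripped-down sketch of that behavior (the target address and port are placeholders, not taken from this log):

    package main

    import (
        "fmt"
        "io"
        "net/http"
        "time"
    )

    func main() {
        // Only status codes 200-399 count as probe success; a 500 like the
        // router's, or a transport error like the connection refused and
        // client timeouts logged above, is recorded as a failure.
        client := &http.Client{Timeout: 1 * time.Second}
        resp, err := client.Get("http://10.217.0.2:1936/healthz/ready") // placeholder target
        if err != nil {
            fmt.Println("probe failure:", err)
            return
        }
        defer resp.Body.Close()
        body, _ := io.ReadAll(resp.Body)
        fmt.Printf("status=%d\n%s\n", resp.StatusCode, body)
    }
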
Mar 17 11:15:51 crc kubenswrapper[4742]: E0317 11:15:51.203970 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:51.703945594 +0000 UTC m=+254.830073352 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.260648 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cw5v4" podStartSLOduration=187.26063054 podStartE2EDuration="3m7.26063054s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:51.220837541 +0000 UTC m=+254.346965299" watchObservedRunningTime="2026-03-17 11:15:51.26063054 +0000 UTC m=+254.386758298"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.304976 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b80c435-0e24-4ab2-980c-f2dfb1baef87-utilities\") pod \"certified-operators-cnzgk\" (UID: \"4b80c435-0e24-4ab2-980c-f2dfb1baef87\") " pod="openshift-marketplace/certified-operators-cnzgk"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.305286 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.305312 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w5nm\" (UniqueName: \"kubernetes.io/projected/4b80c435-0e24-4ab2-980c-f2dfb1baef87-kube-api-access-7w5nm\") pod \"certified-operators-cnzgk\" (UID: \"4b80c435-0e24-4ab2-980c-f2dfb1baef87\") " pod="openshift-marketplace/certified-operators-cnzgk"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.305389 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b80c435-0e24-4ab2-980c-f2dfb1baef87-catalog-content\") pod \"certified-operators-cnzgk\" (UID: \"4b80c435-0e24-4ab2-980c-f2dfb1baef87\") " pod="openshift-marketplace/certified-operators-cnzgk"
Mar 17 11:15:51 crc kubenswrapper[4742]: E0317 11:15:51.305660 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:51.805649559 +0000 UTC m=+254.931777317 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.330998 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5v4hw"] Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.332986 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5v4hw" Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.336126 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.343374 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5v4hw"] Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.407164 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 17 11:15:51 crc kubenswrapper[4742]: E0317 11:15:51.407327 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:51.907285524 +0000 UTC m=+255.033413272 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.407664 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b80c435-0e24-4ab2-980c-f2dfb1baef87-utilities\") pod \"certified-operators-cnzgk\" (UID: \"4b80c435-0e24-4ab2-980c-f2dfb1baef87\") " pod="openshift-marketplace/certified-operators-cnzgk"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.407763 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.407844 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w5nm\" (UniqueName: \"kubernetes.io/projected/4b80c435-0e24-4ab2-980c-f2dfb1baef87-kube-api-access-7w5nm\") pod \"certified-operators-cnzgk\" (UID: \"4b80c435-0e24-4ab2-980c-f2dfb1baef87\") " pod="openshift-marketplace/certified-operators-cnzgk"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.408013 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b80c435-0e24-4ab2-980c-f2dfb1baef87-catalog-content\") pod \"certified-operators-cnzgk\" (UID: \"4b80c435-0e24-4ab2-980c-f2dfb1baef87\") " pod="openshift-marketplace/certified-operators-cnzgk"
Mar 17 11:15:51 crc kubenswrapper[4742]: E0317 11:15:51.408035 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:51.908026816 +0000 UTC m=+255.034154574 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.408686 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b80c435-0e24-4ab2-980c-f2dfb1baef87-catalog-content\") pod \"certified-operators-cnzgk\" (UID: \"4b80c435-0e24-4ab2-980c-f2dfb1baef87\") " pod="openshift-marketplace/certified-operators-cnzgk"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.408969 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b80c435-0e24-4ab2-980c-f2dfb1baef87-utilities\") pod \"certified-operators-cnzgk\" (UID: \"4b80c435-0e24-4ab2-980c-f2dfb1baef87\") " pod="openshift-marketplace/certified-operators-cnzgk"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.434783 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gm8lg"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.443033 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w5nm\" (UniqueName: \"kubernetes.io/projected/4b80c435-0e24-4ab2-980c-f2dfb1baef87-kube-api-access-7w5nm\") pod \"certified-operators-cnzgk\" (UID: \"4b80c435-0e24-4ab2-980c-f2dfb1baef87\") " pod="openshift-marketplace/certified-operators-cnzgk"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.520380 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 17 11:15:51 crc kubenswrapper[4742]: E0317 11:15:51.520533 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:52.020509604 +0000 UTC m=+255.146637362 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.520593 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e6f877-4431-46ba-8c22-0479a383851b-catalog-content\") pod \"community-operators-5v4hw\" (UID: \"72e6f877-4431-46ba-8c22-0479a383851b\") " pod="openshift-marketplace/community-operators-5v4hw"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.520666 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.520687 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e6f877-4431-46ba-8c22-0479a383851b-utilities\") pod \"community-operators-5v4hw\" (UID: \"72e6f877-4431-46ba-8c22-0479a383851b\") " pod="openshift-marketplace/community-operators-5v4hw"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.520707 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5sxs\" (UniqueName: \"kubernetes.io/projected/72e6f877-4431-46ba-8c22-0479a383851b-kube-api-access-w5sxs\") pod \"community-operators-5v4hw\" (UID: \"72e6f877-4431-46ba-8c22-0479a383851b\") " pod="openshift-marketplace/community-operators-5v4hw"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.521221 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cnzgk"
Mar 17 11:15:51 crc kubenswrapper[4742]: E0317 11:15:51.521753 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:52.021743599 +0000 UTC m=+255.147871357 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.543399 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qzdpj"]
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.544258 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qzdpj"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.570692 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qzdpj"]
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.605209 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-spkdx"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.622008 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.622185 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e6f877-4431-46ba-8c22-0479a383851b-utilities\") pod \"community-operators-5v4hw\" (UID: \"72e6f877-4431-46ba-8c22-0479a383851b\") " pod="openshift-marketplace/community-operators-5v4hw"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.622212 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5sxs\" (UniqueName: \"kubernetes.io/projected/72e6f877-4431-46ba-8c22-0479a383851b-kube-api-access-w5sxs\") pod \"community-operators-5v4hw\" (UID: \"72e6f877-4431-46ba-8c22-0479a383851b\") " pod="openshift-marketplace/community-operators-5v4hw"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.622268 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e6f877-4431-46ba-8c22-0479a383851b-catalog-content\") pod \"community-operators-5v4hw\" (UID: \"72e6f877-4431-46ba-8c22-0479a383851b\") " pod="openshift-marketplace/community-operators-5v4hw"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.622645 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e6f877-4431-46ba-8c22-0479a383851b-catalog-content\") pod \"community-operators-5v4hw\" (UID: \"72e6f877-4431-46ba-8c22-0479a383851b\") " pod="openshift-marketplace/community-operators-5v4hw"
Mar 17 11:15:51 crc kubenswrapper[4742]: E0317 11:15:51.622708 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:52.122692804 +0000 UTC m=+255.248820562 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.622921 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e6f877-4431-46ba-8c22-0479a383851b-utilities\") pod \"community-operators-5v4hw\" (UID: \"72e6f877-4431-46ba-8c22-0479a383851b\") " pod="openshift-marketplace/community-operators-5v4hw"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.671785 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" event={"ID":"74651634-b893-441a-9e3c-18a8eaeafcfa","Type":"ContainerStarted","Data":"57858c8ac07eee550bb43947789971f124b9f3eb7efcaed4052be4ebfea363d9"}
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.672369 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" podUID="5ee15c68-88ae-4ca8-b3d5-94266082d7ba" containerName="route-controller-manager" containerID="cri-o://2a3a9577fe0d9db7d18b3b1da9c2681043887f5c0d1c2372f97e42d62b15ea57" gracePeriod=30
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.674501 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" podUID="497b1f19-025b-4b65-b062-b4a94eec3cfc" containerName="controller-manager" containerID="cri-o://0ed2836e2808da8b27ae0271a65ffa592068c830ec52870952512e1c220b8392" gracePeriod=30
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.696979 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5sxs\" (UniqueName: \"kubernetes.io/projected/72e6f877-4431-46ba-8c22-0479a383851b-kube-api-access-w5sxs\") pod \"community-operators-5v4hw\" (UID: \"72e6f877-4431-46ba-8c22-0479a383851b\") " pod="openshift-marketplace/community-operators-5v4hw"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.723180 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.723241 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8e4be20-e918-45d7-b026-12ef2abf3462-catalog-content\") pod \"certified-operators-qzdpj\" (UID: \"c8e4be20-e918-45d7-b026-12ef2abf3462\") " pod="openshift-marketplace/certified-operators-qzdpj"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.723335 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fmck\" (UniqueName: \"kubernetes.io/projected/c8e4be20-e918-45d7-b026-12ef2abf3462-kube-api-access-9fmck\") pod \"certified-operators-qzdpj\" (UID: \"c8e4be20-e918-45d7-b026-12ef2abf3462\") " pod="openshift-marketplace/certified-operators-qzdpj"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.723357 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8e4be20-e918-45d7-b026-12ef2abf3462-utilities\") pod \"certified-operators-qzdpj\" (UID: \"c8e4be20-e918-45d7-b026-12ef2abf3462\") " pod="openshift-marketplace/certified-operators-qzdpj"
Mar 17 11:15:51 crc kubenswrapper[4742]: E0317 11:15:51.723662 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:52.223650329 +0000 UTC m=+255.349778087 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.752276 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x6ffv"]
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.753340 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x6ffv"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.764416 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x6ffv"]
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.825740 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 17 11:15:51 crc kubenswrapper[4742]: E0317 11:15:51.827088 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:52.327066525 +0000 UTC m=+255.453194273 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.828508 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fmck\" (UniqueName: \"kubernetes.io/projected/c8e4be20-e918-45d7-b026-12ef2abf3462-kube-api-access-9fmck\") pod \"certified-operators-qzdpj\" (UID: \"c8e4be20-e918-45d7-b026-12ef2abf3462\") " pod="openshift-marketplace/certified-operators-qzdpj"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.839292 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8e4be20-e918-45d7-b026-12ef2abf3462-utilities\") pod \"certified-operators-qzdpj\" (UID: \"c8e4be20-e918-45d7-b026-12ef2abf3462\") " pod="openshift-marketplace/certified-operators-qzdpj"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.839549 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.839621 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8e4be20-e918-45d7-b026-12ef2abf3462-catalog-content\") pod \"certified-operators-qzdpj\" (UID: \"c8e4be20-e918-45d7-b026-12ef2abf3462\") " pod="openshift-marketplace/certified-operators-qzdpj"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.840112 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8e4be20-e918-45d7-b026-12ef2abf3462-utilities\") pod \"certified-operators-qzdpj\" (UID: \"c8e4be20-e918-45d7-b026-12ef2abf3462\") " pod="openshift-marketplace/certified-operators-qzdpj"
Mar 17 11:15:51 crc kubenswrapper[4742]: E0317 11:15:51.842335 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:52.342324126 +0000 UTC m=+255.468451884 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.847980 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8e4be20-e918-45d7-b026-12ef2abf3462-catalog-content\") pod \"certified-operators-qzdpj\" (UID: \"c8e4be20-e918-45d7-b026-12ef2abf3462\") " pod="openshift-marketplace/certified-operators-qzdpj"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.860506 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fmck\" (UniqueName: \"kubernetes.io/projected/c8e4be20-e918-45d7-b026-12ef2abf3462-kube-api-access-9fmck\") pod \"certified-operators-qzdpj\" (UID: \"c8e4be20-e918-45d7-b026-12ef2abf3462\") " pod="openshift-marketplace/certified-operators-qzdpj"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.869970 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qzdpj"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.951645 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.952186 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0543787-88e8-463d-b01b-694ecb854bfa-utilities\") pod \"community-operators-x6ffv\" (UID: \"e0543787-88e8-463d-b01b-694ecb854bfa\") " pod="openshift-marketplace/community-operators-x6ffv"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.952228 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7nqr\" (UniqueName: \"kubernetes.io/projected/e0543787-88e8-463d-b01b-694ecb854bfa-kube-api-access-p7nqr\") pod \"community-operators-x6ffv\" (UID: \"e0543787-88e8-463d-b01b-694ecb854bfa\") " pod="openshift-marketplace/community-operators-x6ffv"
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.952248 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0543787-88e8-463d-b01b-694ecb854bfa-catalog-content\") pod \"community-operators-x6ffv\" (UID: \"e0543787-88e8-463d-b01b-694ecb854bfa\") " pod="openshift-marketplace/community-operators-x6ffv"
Mar 17 11:15:51 crc kubenswrapper[4742]: E0317 11:15:51.952418 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:52.452400223 +0000 UTC m=+255.578527981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:51 crc kubenswrapper[4742]: I0317 11:15:51.956467 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5v4hw"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.053353 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0543787-88e8-463d-b01b-694ecb854bfa-utilities\") pod \"community-operators-x6ffv\" (UID: \"e0543787-88e8-463d-b01b-694ecb854bfa\") " pod="openshift-marketplace/community-operators-x6ffv"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.053411 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7nqr\" (UniqueName: \"kubernetes.io/projected/e0543787-88e8-463d-b01b-694ecb854bfa-kube-api-access-p7nqr\") pod \"community-operators-x6ffv\" (UID: \"e0543787-88e8-463d-b01b-694ecb854bfa\") " pod="openshift-marketplace/community-operators-x6ffv"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.053430 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0543787-88e8-463d-b01b-694ecb854bfa-catalog-content\") pod \"community-operators-x6ffv\" (UID: \"e0543787-88e8-463d-b01b-694ecb854bfa\") " pod="openshift-marketplace/community-operators-x6ffv"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.053452 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:52 crc kubenswrapper[4742]: E0317 11:15:52.053737 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:52.55372523 +0000 UTC m=+255.679852988 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.054210 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0543787-88e8-463d-b01b-694ecb854bfa-utilities\") pod \"community-operators-x6ffv\" (UID: \"e0543787-88e8-463d-b01b-694ecb854bfa\") " pod="openshift-marketplace/community-operators-x6ffv"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.054637 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0543787-88e8-463d-b01b-694ecb854bfa-catalog-content\") pod \"community-operators-x6ffv\" (UID: \"e0543787-88e8-463d-b01b-694ecb854bfa\") " pod="openshift-marketplace/community-operators-x6ffv"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.099840 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7nqr\" (UniqueName: \"kubernetes.io/projected/e0543787-88e8-463d-b01b-694ecb854bfa-kube-api-access-p7nqr\") pod \"community-operators-x6ffv\" (UID: \"e0543787-88e8-463d-b01b-694ecb854bfa\") " pod="openshift-marketplace/community-operators-x6ffv"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.160726 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 17 11:15:52 crc kubenswrapper[4742]: E0317 11:15:52.161108 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:52.661093479 +0000 UTC m=+255.787221227 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.180341 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x6ffv"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.180818 4742 ???:1] "http: TLS handshake error from 192.168.126.11:37770: no serving certificate available for the kubelet"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.203119 4742 patch_prober.go:28] interesting pod/router-default-5444994796-hwx7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 17 11:15:52 crc kubenswrapper[4742]: [-]has-synced failed: reason withheld
Mar 17 11:15:52 crc kubenswrapper[4742]: [+]process-running ok
Mar 17 11:15:52 crc kubenswrapper[4742]: healthz check failed
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.203193 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hwx7f" podUID="2800a131-02e6-49f1-9385-6065c4b4216e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.207709 4742 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.268307 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:52 crc kubenswrapper[4742]: E0317 11:15:52.268932 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-17 11:15:52.768920522 +0000 UTC m=+255.895048280 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-9lz9n" (UID: "6b38516a-3938-421e-9191-03786c23318c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.312843 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cnzgk"]
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.347995 4742 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-17T11:15:52.207739557Z","Handler":null,"Name":""}
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.372249 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 17 11:15:52 crc kubenswrapper[4742]: E0317 11:15:52.372663 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-17 11:15:52.872647407 +0000 UTC m=+255.998775165 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.372873 4742 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.372893 4742 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.397647 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zk827"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.467150 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b7568f775-wqc5s"]
Mar 17 11:15:52 crc kubenswrapper[4742]: E0317 11:15:52.467744 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="497b1f19-025b-4b65-b062-b4a94eec3cfc" containerName="controller-manager"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.467758 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="497b1f19-025b-4b65-b062-b4a94eec3cfc" containerName="controller-manager"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.467887 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="497b1f19-025b-4b65-b062-b4a94eec3cfc" containerName="controller-manager"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.468338 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.471219 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b7568f775-wqc5s"]
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.483266 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pk7z\" (UniqueName: \"kubernetes.io/projected/497b1f19-025b-4b65-b062-b4a94eec3cfc-kube-api-access-7pk7z\") pod \"497b1f19-025b-4b65-b062-b4a94eec3cfc\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") "
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.483462 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-config\") pod \"497b1f19-025b-4b65-b062-b4a94eec3cfc\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") "
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.483506 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-proxy-ca-bundles\") pod \"497b1f19-025b-4b65-b062-b4a94eec3cfc\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") "
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.483524 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-client-ca\") pod \"497b1f19-025b-4b65-b062-b4a94eec3cfc\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") "
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.483564 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/497b1f19-025b-4b65-b062-b4a94eec3cfc-serving-cert\") pod \"497b1f19-025b-4b65-b062-b4a94eec3cfc\" (UID: \"497b1f19-025b-4b65-b062-b4a94eec3cfc\") "
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.483752 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-client-ca\") pod \"controller-manager-7b7568f775-wqc5s\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.483776 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af781729-fe84-4dcf-afed-cb56769bf4ca-serving-cert\") pod \"controller-manager-7b7568f775-wqc5s\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.483806 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.483826 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-config\") pod \"controller-manager-7b7568f775-wqc5s\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.483863 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-proxy-ca-bundles\") pod \"controller-manager-7b7568f775-wqc5s\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.483896 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqvtk\" (UniqueName: \"kubernetes.io/projected/af781729-fe84-4dcf-afed-cb56769bf4ca-kube-api-access-pqvtk\") pod \"controller-manager-7b7568f775-wqc5s\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.484696 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-config" (OuterVolumeSpecName: "config") pod "497b1f19-025b-4b65-b062-b4a94eec3cfc" (UID: "497b1f19-025b-4b65-b062-b4a94eec3cfc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.485145 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "497b1f19-025b-4b65-b062-b4a94eec3cfc" (UID: "497b1f19-025b-4b65-b062-b4a94eec3cfc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.485200 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-client-ca" (OuterVolumeSpecName: "client-ca") pod "497b1f19-025b-4b65-b062-b4a94eec3cfc" (UID: "497b1f19-025b-4b65-b062-b4a94eec3cfc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.492319 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/497b1f19-025b-4b65-b062-b4a94eec3cfc-kube-api-access-7pk7z" (OuterVolumeSpecName: "kube-api-access-7pk7z") pod "497b1f19-025b-4b65-b062-b4a94eec3cfc" (UID: "497b1f19-025b-4b65-b062-b4a94eec3cfc"). InnerVolumeSpecName "kube-api-access-7pk7z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.518079 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/497b1f19-025b-4b65-b062-b4a94eec3cfc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "497b1f19-025b-4b65-b062-b4a94eec3cfc" (UID: "497b1f19-025b-4b65-b062-b4a94eec3cfc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.544209 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x6ffv"]
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.555585 4742 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.555642 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.560417 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.585395 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-serving-cert\") pod \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\" (UID: \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\") "
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.585446 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn2r4\" (UniqueName: \"kubernetes.io/projected/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-kube-api-access-xn2r4\") pod \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\" (UID: \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\") "
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.585612 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-config\") pod \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\" (UID: \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\") "
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.585650 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-client-ca\") pod \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\" (UID: \"5ee15c68-88ae-4ca8-b3d5-94266082d7ba\") "
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.586352 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-config\") pod \"controller-manager-7b7568f775-wqc5s\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.586412 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-proxy-ca-bundles\") pod \"controller-manager-7b7568f775-wqc5s\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.586447 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqvtk\" (UniqueName: \"kubernetes.io/projected/af781729-fe84-4dcf-afed-cb56769bf4ca-kube-api-access-pqvtk\") pod \"controller-manager-7b7568f775-wqc5s\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.588111 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-client-ca\") pod \"controller-manager-7b7568f775-wqc5s\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.588135 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af781729-fe84-4dcf-afed-cb56769bf4ca-serving-cert\") pod \"controller-manager-7b7568f775-wqc5s\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.588171 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pk7z\" (UniqueName: \"kubernetes.io/projected/497b1f19-025b-4b65-b062-b4a94eec3cfc-kube-api-access-7pk7z\") on node \"crc\" DevicePath \"\""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.588183 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-config\") on node \"crc\" DevicePath \"\""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.588192 4742 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.588201 4742 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/497b1f19-025b-4b65-b062-b4a94eec3cfc-client-ca\") on node \"crc\" DevicePath \"\""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.588208 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/497b1f19-025b-4b65-b062-b4a94eec3cfc-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.597322 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-client-ca" (OuterVolumeSpecName: "client-ca") pod "5ee15c68-88ae-4ca8-b3d5-94266082d7ba" (UID: "5ee15c68-88ae-4ca8-b3d5-94266082d7ba"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.598302 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-kube-api-access-xn2r4" (OuterVolumeSpecName: "kube-api-access-xn2r4") pod "5ee15c68-88ae-4ca8-b3d5-94266082d7ba" (UID: "5ee15c68-88ae-4ca8-b3d5-94266082d7ba"). InnerVolumeSpecName "kube-api-access-xn2r4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.598489 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-config\") pod \"controller-manager-7b7568f775-wqc5s\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.598529 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-proxy-ca-bundles\") pod \"controller-manager-7b7568f775-wqc5s\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.598871 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-client-ca\") pod \"controller-manager-7b7568f775-wqc5s\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.599148 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af781729-fe84-4dcf-afed-cb56769bf4ca-serving-cert\") pod \"controller-manager-7b7568f775-wqc5s\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.606269 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-config" (OuterVolumeSpecName: "config") pod "5ee15c68-88ae-4ca8-b3d5-94266082d7ba" (UID: "5ee15c68-88ae-4ca8-b3d5-94266082d7ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.608893 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-9lz9n\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") " pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.619094 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqvtk\" (UniqueName: \"kubernetes.io/projected/af781729-fe84-4dcf-afed-cb56769bf4ca-kube-api-access-pqvtk\") pod \"controller-manager-7b7568f775-wqc5s\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.619483 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5ee15c68-88ae-4ca8-b3d5-94266082d7ba" (UID: "5ee15c68-88ae-4ca8-b3d5-94266082d7ba"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.648419 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.666125 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.687980 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" event={"ID":"74651634-b893-441a-9e3c-18a8eaeafcfa","Type":"ContainerStarted","Data":"5c32f1ed406fab068b93e7a2c938892a34369ec79c3cce6c1e3d4636b2921fc9"}
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.688809 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.689553 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.689656 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn2r4\" (UniqueName: \"kubernetes.io/projected/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-kube-api-access-xn2r4\") on node \"crc\" DevicePath \"\""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.690289 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-config\") on node \"crc\" DevicePath \"\""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.690353 4742 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ee15c68-88ae-4ca8-b3d5-94266082d7ba-client-ca\") on node \"crc\" DevicePath \"\""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.705146 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6ffv" event={"ID":"e0543787-88e8-463d-b01b-694ecb854bfa","Type":"ContainerStarted","Data":"f8a18257578e5a388701bfa5e3da59d0226c24a9619a3a87268ebc3b8bfd9611"}
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.705531 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.709507 4742 generic.go:334] "Generic (PLEG): container finished" podID="497b1f19-025b-4b65-b062-b4a94eec3cfc" containerID="0ed2836e2808da8b27ae0271a65ffa592068c830ec52870952512e1c220b8392" exitCode=0
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.709605 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" event={"ID":"497b1f19-025b-4b65-b062-b4a94eec3cfc","Type":"ContainerDied","Data":"0ed2836e2808da8b27ae0271a65ffa592068c830ec52870952512e1c220b8392"}
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.709609 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zk827"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.709650 4742 scope.go:117] "RemoveContainer" containerID="0ed2836e2808da8b27ae0271a65ffa592068c830ec52870952512e1c220b8392"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.709633 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zk827" event={"ID":"497b1f19-025b-4b65-b062-b4a94eec3cfc","Type":"ContainerDied","Data":"3836086a144949811b4cd5e235fb4e6d99ec743b2b998f2653ee87794f7f3e09"}
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.720448 4742 generic.go:334] "Generic (PLEG): container finished" podID="5ee15c68-88ae-4ca8-b3d5-94266082d7ba" containerID="2a3a9577fe0d9db7d18b3b1da9c2681043887f5c0d1c2372f97e42d62b15ea57" exitCode=0
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.720532 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.720544 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" event={"ID":"5ee15c68-88ae-4ca8-b3d5-94266082d7ba","Type":"ContainerDied","Data":"2a3a9577fe0d9db7d18b3b1da9c2681043887f5c0d1c2372f97e42d62b15ea57"}
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.720640 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd" event={"ID":"5ee15c68-88ae-4ca8-b3d5-94266082d7ba","Type":"ContainerDied","Data":"8219a95caeb835f4d08cd802c62ee95dbfaae8be8387e93b26b2805f50dcd3c3"}
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.732520 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cnzgk" event={"ID":"4b80c435-0e24-4ab2-980c-f2dfb1baef87","Type":"ContainerStarted","Data":"67fa5be08079fe2895da8132f995273c1c25e0ee686c429150d425a61411791f"}
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.740796 4742 generic.go:334] "Generic (PLEG): container finished" podID="75629956-e407-4638-90cd-fd2f907bb0fb" containerID="300d665f9fa4f8318b1145926cca19ef60266854c5ffcc0a3bc845995ee1c214" exitCode=0
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.741425 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" event={"ID":"75629956-e407-4638-90cd-fd2f907bb0fb","Type":"ContainerDied","Data":"300d665f9fa4f8318b1145926cca19ef60266854c5ffcc0a3bc845995ee1c214"}
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.756828 4742 scope.go:117] "RemoveContainer" containerID="0ed2836e2808da8b27ae0271a65ffa592068c830ec52870952512e1c220b8392"
Mar 17 11:15:52 crc kubenswrapper[4742]: E0317 11:15:52.758210 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed2836e2808da8b27ae0271a65ffa592068c830ec52870952512e1c220b8392\": container with ID starting with 0ed2836e2808da8b27ae0271a65ffa592068c830ec52870952512e1c220b8392 not found: ID does not exist" containerID="0ed2836e2808da8b27ae0271a65ffa592068c830ec52870952512e1c220b8392"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.758247 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed2836e2808da8b27ae0271a65ffa592068c830ec52870952512e1c220b8392"} err="failed to get container status \"0ed2836e2808da8b27ae0271a65ffa592068c830ec52870952512e1c220b8392\": rpc error: code = NotFound desc = could not find container \"0ed2836e2808da8b27ae0271a65ffa592068c830ec52870952512e1c220b8392\": container with ID starting with 0ed2836e2808da8b27ae0271a65ffa592068c830ec52870952512e1c220b8392 not found: ID does not exist"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.758268 4742 scope.go:117] "RemoveContainer" containerID="2a3a9577fe0d9db7d18b3b1da9c2681043887f5c0d1c2372f97e42d62b15ea57"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.795772 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zk827"]
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.811098 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zk827"]
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.835581 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd"]
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.863134 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4msdd"]
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.866163 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qzdpj"]
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.871632 4742 scope.go:117] "RemoveContainer" containerID="2a3a9577fe0d9db7d18b3b1da9c2681043887f5c0d1c2372f97e42d62b15ea57"
Mar 17 11:15:52 crc kubenswrapper[4742]: E0317 11:15:52.872406 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a3a9577fe0d9db7d18b3b1da9c2681043887f5c0d1c2372f97e42d62b15ea57\": container with ID starting with 2a3a9577fe0d9db7d18b3b1da9c2681043887f5c0d1c2372f97e42d62b15ea57 not found: ID does not exist" containerID="2a3a9577fe0d9db7d18b3b1da9c2681043887f5c0d1c2372f97e42d62b15ea57"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.872438 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a3a9577fe0d9db7d18b3b1da9c2681043887f5c0d1c2372f97e42d62b15ea57"} err="failed to get container status \"2a3a9577fe0d9db7d18b3b1da9c2681043887f5c0d1c2372f97e42d62b15ea57\": rpc error: code = NotFound desc = could not find container \"2a3a9577fe0d9db7d18b3b1da9c2681043887f5c0d1c2372f97e42d62b15ea57\": container with ID starting with 2a3a9577fe0d9db7d18b3b1da9c2681043887f5c0d1c2372f97e42d62b15ea57 not found: ID does not exist"
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.874104 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5v4hw"]
Mar 17 11:15:52 crc kubenswrapper[4742]: I0317 11:15:52.979076 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b7568f775-wqc5s"]
Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.005530 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9lz9n"]
Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.192201 4742 patch_prober.go:28] interesting pod/router-default-5444994796-hwx7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 17 11:15:53 crc kubenswrapper[4742]: [-]has-synced failed: reason withheld
Mar 17 11:15:53 crc kubenswrapper[4742]: [+]process-running ok
Mar 17 11:15:53 crc kubenswrapper[4742]: healthz check failed
Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.192534 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hwx7f" podUID="2800a131-02e6-49f1-9385-6065c4b4216e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.320825 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p27vh"]
Mar 17 11:15:53 crc kubenswrapper[4742]: E0317 11:15:53.321038 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee15c68-88ae-4ca8-b3d5-94266082d7ba" containerName="route-controller-manager"
Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.321052 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee15c68-88ae-4ca8-b3d5-94266082d7ba" containerName="route-controller-manager"
Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.321143 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ee15c68-88ae-4ca8-b3d5-94266082d7ba" containerName="route-controller-manager"
Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.321768 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p27vh"
Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.324158 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.335543 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p27vh"]
Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.406390 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-catalog-content\") pod \"redhat-marketplace-p27vh\" (UID: \"ce3a51df-d6e4-46d5-95e3-8be6aaba196f\") " pod="openshift-marketplace/redhat-marketplace-p27vh"
Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.406471 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkglr\" (UniqueName: \"kubernetes.io/projected/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-kube-api-access-tkglr\") pod \"redhat-marketplace-p27vh\" (UID: \"ce3a51df-d6e4-46d5-95e3-8be6aaba196f\") " pod="openshift-marketplace/redhat-marketplace-p27vh"
Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.406507 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-utilities\") pod \"redhat-marketplace-p27vh\" (UID: \"ce3a51df-d6e4-46d5-95e3-8be6aaba196f\") " pod="openshift-marketplace/redhat-marketplace-p27vh"
Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.507612 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-utilities\") pod \"redhat-marketplace-p27vh\" (UID: \"ce3a51df-d6e4-46d5-95e3-8be6aaba196f\") " pod="openshift-marketplace/redhat-marketplace-p27vh"
Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.507725 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-catalog-content\") pod \"redhat-marketplace-p27vh\" (UID: \"ce3a51df-d6e4-46d5-95e3-8be6aaba196f\") " pod="openshift-marketplace/redhat-marketplace-p27vh"
Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.507764 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkglr\" (UniqueName: \"kubernetes.io/projected/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-kube-api-access-tkglr\") pod \"redhat-marketplace-p27vh\" (UID: \"ce3a51df-d6e4-46d5-95e3-8be6aaba196f\") " pod="openshift-marketplace/redhat-marketplace-p27vh"
Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.508172 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-utilities\") pod \"redhat-marketplace-p27vh\" (UID: \"ce3a51df-d6e4-46d5-95e3-8be6aaba196f\") " pod="openshift-marketplace/redhat-marketplace-p27vh"
Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.508214 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-catalog-content\") pod \"redhat-marketplace-p27vh\" (UID:
\"ce3a51df-d6e4-46d5-95e3-8be6aaba196f\") " pod="openshift-marketplace/redhat-marketplace-p27vh" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.526671 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkglr\" (UniqueName: \"kubernetes.io/projected/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-kube-api-access-tkglr\") pod \"redhat-marketplace-p27vh\" (UID: \"ce3a51df-d6e4-46d5-95e3-8be6aaba196f\") " pod="openshift-marketplace/redhat-marketplace-p27vh" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.634256 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p27vh" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.722657 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6mvgb"] Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.724044 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6mvgb" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.737203 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6mvgb"] Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.753288 4742 generic.go:334] "Generic (PLEG): container finished" podID="4b80c435-0e24-4ab2-980c-f2dfb1baef87" containerID="04ed2a1da8652b1cd0b35faff8375071651dedb4b2ab99e74e891f2f33e13e58" exitCode=0 Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.753515 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cnzgk" event={"ID":"4b80c435-0e24-4ab2-980c-f2dfb1baef87","Type":"ContainerDied","Data":"04ed2a1da8652b1cd0b35faff8375071651dedb4b2ab99e74e891f2f33e13e58"} Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.767163 4742 generic.go:334] "Generic (PLEG): container finished" podID="c8e4be20-e918-45d7-b026-12ef2abf3462" containerID="bb65bae714172788645ff2c5182bf02c58aab2c62900ead05c1f40fa4a83df45" exitCode=0 Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.767259 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzdpj" event={"ID":"c8e4be20-e918-45d7-b026-12ef2abf3462","Type":"ContainerDied","Data":"bb65bae714172788645ff2c5182bf02c58aab2c62900ead05c1f40fa4a83df45"} Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.767290 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzdpj" event={"ID":"c8e4be20-e918-45d7-b026-12ef2abf3462","Type":"ContainerStarted","Data":"133ee933f61944ecf40afae60eaafcd8d818ce9cd470fcf65c90b3c6136a2835"} Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.770791 4742 generic.go:334] "Generic (PLEG): container finished" podID="72e6f877-4431-46ba-8c22-0479a383851b" containerID="5e8f9625086cd06185e0d36a1c5105d424adb3367bcaefc6ec9d6c22f6d06951" exitCode=0 Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.770837 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5v4hw" event={"ID":"72e6f877-4431-46ba-8c22-0479a383851b","Type":"ContainerDied","Data":"5e8f9625086cd06185e0d36a1c5105d424adb3367bcaefc6ec9d6c22f6d06951"} Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.770857 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5v4hw" 
event={"ID":"72e6f877-4431-46ba-8c22-0479a383851b","Type":"ContainerStarted","Data":"1d4b2caa7f8e1ad04201a863cab02f5690154c0e24d3fea9a1769cd7c06a56ce"} Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.789735 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" event={"ID":"6b38516a-3938-421e-9191-03786c23318c","Type":"ContainerStarted","Data":"0e55ea87007e27cbabd32b962fba400b272f4d7478a340f5715c6574942a8890"} Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.789790 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" event={"ID":"6b38516a-3938-421e-9191-03786c23318c","Type":"ContainerStarted","Data":"8dc68984417269e3c1b938664797e8757d61c155cf838069ac66a2e6a5dc48bf"} Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.790612 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.790636 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.791222 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.794643 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.794937 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.805861 4742 generic.go:334] "Generic (PLEG): container finished" podID="e0543787-88e8-463d-b01b-694ecb854bfa" containerID="ebf507ac9ceaca14caee44b4fa7a51a395a9d11e4dd52e60815e1633cbdf3fe8" exitCode=0 Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.805948 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6ffv" event={"ID":"e0543787-88e8-463d-b01b-694ecb854bfa","Type":"ContainerDied","Data":"ebf507ac9ceaca14caee44b4fa7a51a395a9d11e4dd52e60815e1633cbdf3fe8"} Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.811330 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/067d9a25-2589-43d1-ac71-b7fbe6416a31-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"067d9a25-2589-43d1-ac71-b7fbe6416a31\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.811360 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9xsm\" (UniqueName: \"kubernetes.io/projected/d416e1fd-2137-48a4-b933-b25a9ca94a8a-kube-api-access-s9xsm\") pod \"redhat-marketplace-6mvgb\" (UID: \"d416e1fd-2137-48a4-b933-b25a9ca94a8a\") " pod="openshift-marketplace/redhat-marketplace-6mvgb" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.811385 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d416e1fd-2137-48a4-b933-b25a9ca94a8a-catalog-content\") pod \"redhat-marketplace-6mvgb\" (UID: \"d416e1fd-2137-48a4-b933-b25a9ca94a8a\") " 
pod="openshift-marketplace/redhat-marketplace-6mvgb" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.811469 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d416e1fd-2137-48a4-b933-b25a9ca94a8a-utilities\") pod \"redhat-marketplace-6mvgb\" (UID: \"d416e1fd-2137-48a4-b933-b25a9ca94a8a\") " pod="openshift-marketplace/redhat-marketplace-6mvgb" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.811549 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/067d9a25-2589-43d1-ac71-b7fbe6416a31-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"067d9a25-2589-43d1-ac71-b7fbe6416a31\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.813456 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.819993 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" event={"ID":"74651634-b893-441a-9e3c-18a8eaeafcfa","Type":"ContainerStarted","Data":"5bc986b34ab06d8b3a551e614dab14437f0f929fd714425935b900d772442527"} Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.824119 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s" event={"ID":"af781729-fe84-4dcf-afed-cb56769bf4ca","Type":"ContainerStarted","Data":"68e472ab1a0d17f93ce017abb5eef75c7bb0ec16947bc450b6597f73972f53ef"} Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.824152 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.824162 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s" event={"ID":"af781729-fe84-4dcf-afed-cb56769bf4ca","Type":"ContainerStarted","Data":"d525baf062858770152386a085e86efda2240d690c2c7748b457a439e06b8c45"} Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.825790 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" podStartSLOduration=189.825781634 podStartE2EDuration="3m9.825781634s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:53.825442435 +0000 UTC m=+256.951570193" watchObservedRunningTime="2026-03-17 11:15:53.825781634 +0000 UTC m=+256.951909392" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.850312 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.862753 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p27vh"] Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.865541 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s" podStartSLOduration=3.865521292 podStartE2EDuration="3.865521292s" podCreationTimestamp="2026-03-17 11:15:50 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:53.856400688 +0000 UTC m=+256.982528446" watchObservedRunningTime="2026-03-17 11:15:53.865521292 +0000 UTC m=+256.991649050" Mar 17 11:15:53 crc kubenswrapper[4742]: W0317 11:15:53.889759 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce3a51df_d6e4_46d5_95e3_8be6aaba196f.slice/crio-970e14da973914f33acd8fc38cc0abd459e88f062e298899a6e3945894a002c7 WatchSource:0}: Error finding container 970e14da973914f33acd8fc38cc0abd459e88f062e298899a6e3945894a002c7: Status 404 returned error can't find the container with id 970e14da973914f33acd8fc38cc0abd459e88f062e298899a6e3945894a002c7 Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.891110 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-cpjwx" podStartSLOduration=11.89109191 podStartE2EDuration="11.89109191s" podCreationTimestamp="2026-03-17 11:15:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:53.88139365 +0000 UTC m=+257.007521408" watchObservedRunningTime="2026-03-17 11:15:53.89109191 +0000 UTC m=+257.017219678" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.937218 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d416e1fd-2137-48a4-b933-b25a9ca94a8a-utilities\") pod \"redhat-marketplace-6mvgb\" (UID: \"d416e1fd-2137-48a4-b933-b25a9ca94a8a\") " pod="openshift-marketplace/redhat-marketplace-6mvgb" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.937400 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/067d9a25-2589-43d1-ac71-b7fbe6416a31-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"067d9a25-2589-43d1-ac71-b7fbe6416a31\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.937470 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/067d9a25-2589-43d1-ac71-b7fbe6416a31-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"067d9a25-2589-43d1-ac71-b7fbe6416a31\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.937502 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9xsm\" (UniqueName: \"kubernetes.io/projected/d416e1fd-2137-48a4-b933-b25a9ca94a8a-kube-api-access-s9xsm\") pod \"redhat-marketplace-6mvgb\" (UID: \"d416e1fd-2137-48a4-b933-b25a9ca94a8a\") " pod="openshift-marketplace/redhat-marketplace-6mvgb" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.937533 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d416e1fd-2137-48a4-b933-b25a9ca94a8a-catalog-content\") pod \"redhat-marketplace-6mvgb\" (UID: \"d416e1fd-2137-48a4-b933-b25a9ca94a8a\") " pod="openshift-marketplace/redhat-marketplace-6mvgb" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.938191 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/067d9a25-2589-43d1-ac71-b7fbe6416a31-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"067d9a25-2589-43d1-ac71-b7fbe6416a31\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.938497 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d416e1fd-2137-48a4-b933-b25a9ca94a8a-catalog-content\") pod \"redhat-marketplace-6mvgb\" (UID: \"d416e1fd-2137-48a4-b933-b25a9ca94a8a\") " pod="openshift-marketplace/redhat-marketplace-6mvgb" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.938611 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d416e1fd-2137-48a4-b933-b25a9ca94a8a-utilities\") pod \"redhat-marketplace-6mvgb\" (UID: \"d416e1fd-2137-48a4-b933-b25a9ca94a8a\") " pod="openshift-marketplace/redhat-marketplace-6mvgb" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.960900 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/067d9a25-2589-43d1-ac71-b7fbe6416a31-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"067d9a25-2589-43d1-ac71-b7fbe6416a31\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 11:15:53 crc kubenswrapper[4742]: I0317 11:15:53.961486 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9xsm\" (UniqueName: \"kubernetes.io/projected/d416e1fd-2137-48a4-b933-b25a9ca94a8a-kube-api-access-s9xsm\") pod \"redhat-marketplace-6mvgb\" (UID: \"d416e1fd-2137-48a4-b933-b25a9ca94a8a\") " pod="openshift-marketplace/redhat-marketplace-6mvgb" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.002066 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.002732 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.005457 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.005688 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.019733 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.036121 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.039190 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1a8e201-d6d6-49f0-885e-615ceffeae2f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e1a8e201-d6d6-49f0-885e-615ceffeae2f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.039255 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1a8e201-d6d6-49f0-885e-615ceffeae2f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e1a8e201-d6d6-49f0-885e-615ceffeae2f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.044608 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6mvgb" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.123677 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.140586 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75629956-e407-4638-90cd-fd2f907bb0fb-secret-volume\") pod \"75629956-e407-4638-90cd-fd2f907bb0fb\" (UID: \"75629956-e407-4638-90cd-fd2f907bb0fb\") " Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.140630 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbt7b\" (UniqueName: \"kubernetes.io/projected/75629956-e407-4638-90cd-fd2f907bb0fb-kube-api-access-zbt7b\") pod \"75629956-e407-4638-90cd-fd2f907bb0fb\" (UID: \"75629956-e407-4638-90cd-fd2f907bb0fb\") " Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.140729 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75629956-e407-4638-90cd-fd2f907bb0fb-config-volume\") pod \"75629956-e407-4638-90cd-fd2f907bb0fb\" (UID: \"75629956-e407-4638-90cd-fd2f907bb0fb\") " Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.140955 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1a8e201-d6d6-49f0-885e-615ceffeae2f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e1a8e201-d6d6-49f0-885e-615ceffeae2f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.141007 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1a8e201-d6d6-49f0-885e-615ceffeae2f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e1a8e201-d6d6-49f0-885e-615ceffeae2f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.144654 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1a8e201-d6d6-49f0-885e-615ceffeae2f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e1a8e201-d6d6-49f0-885e-615ceffeae2f\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.146550 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75629956-e407-4638-90cd-fd2f907bb0fb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "75629956-e407-4638-90cd-fd2f907bb0fb" (UID: "75629956-e407-4638-90cd-fd2f907bb0fb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.148416 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75629956-e407-4638-90cd-fd2f907bb0fb-kube-api-access-zbt7b" (OuterVolumeSpecName: "kube-api-access-zbt7b") pod "75629956-e407-4638-90cd-fd2f907bb0fb" (UID: "75629956-e407-4638-90cd-fd2f907bb0fb"). InnerVolumeSpecName "kube-api-access-zbt7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.148460 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75629956-e407-4638-90cd-fd2f907bb0fb-config-volume" (OuterVolumeSpecName: "config-volume") pod "75629956-e407-4638-90cd-fd2f907bb0fb" (UID: "75629956-e407-4638-90cd-fd2f907bb0fb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.174395 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1a8e201-d6d6-49f0-885e-615ceffeae2f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e1a8e201-d6d6-49f0-885e-615ceffeae2f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.186257 4742 patch_prober.go:28] interesting pod/router-default-5444994796-hwx7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 11:15:54 crc kubenswrapper[4742]: [-]has-synced failed: reason withheld Mar 17 11:15:54 crc kubenswrapper[4742]: [+]process-running ok Mar 17 11:15:54 crc kubenswrapper[4742]: healthz check failed Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.186306 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hwx7f" podUID="2800a131-02e6-49f1-9385-6065c4b4216e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.242146 4742 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75629956-e407-4638-90cd-fd2f907bb0fb-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.242215 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbt7b\" (UniqueName: \"kubernetes.io/projected/75629956-e407-4638-90cd-fd2f907bb0fb-kube-api-access-zbt7b\") on node \"crc\" DevicePath \"\"" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.242229 4742 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75629956-e407-4638-90cd-fd2f907bb0fb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.273117 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-6mvgb"] Mar 17 11:15:54 crc kubenswrapper[4742]: W0317 11:15:54.282668 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd416e1fd_2137_48a4_b933_b25a9ca94a8a.slice/crio-782e31003da6e9c2ce12c25c7cfdb411809080f461c17ec85b30bd2d0a523261 WatchSource:0}: Error finding container 782e31003da6e9c2ce12c25c7cfdb411809080f461c17ec85b30bd2d0a523261: Status 404 returned error can't find the container with id 782e31003da6e9c2ce12c25c7cfdb411809080f461c17ec85b30bd2d0a523261 Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.321408 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-47pqd"] Mar 17 11:15:54 crc kubenswrapper[4742]: E0317 11:15:54.321610 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75629956-e407-4638-90cd-fd2f907bb0fb" containerName="collect-profiles" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.321622 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="75629956-e407-4638-90cd-fd2f907bb0fb" containerName="collect-profiles" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.321725 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="75629956-e407-4638-90cd-fd2f907bb0fb" containerName="collect-profiles" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.322399 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-47pqd" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.328648 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.333846 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.334375 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-47pqd"] Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.341404 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 17 11:15:54 crc kubenswrapper[4742]: W0317 11:15:54.361453 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod067d9a25_2589_43d1_ac71_b7fbe6416a31.slice/crio-e5fb1531b6dba8c21de6b2efd9d6e4ef2699a9d2273ac3efd43de67e07a3bf7d WatchSource:0}: Error finding container e5fb1531b6dba8c21de6b2efd9d6e4ef2699a9d2273ac3efd43de67e07a3bf7d: Status 404 returned error can't find the container with id e5fb1531b6dba8c21de6b2efd9d6e4ef2699a9d2273ac3efd43de67e07a3bf7d Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.444982 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24946b1f-6d3e-457c-b78f-213f94b2b650-catalog-content\") pod \"redhat-operators-47pqd\" (UID: \"24946b1f-6d3e-457c-b78f-213f94b2b650\") " pod="openshift-marketplace/redhat-operators-47pqd" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.445125 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbjf9\" (UniqueName: \"kubernetes.io/projected/24946b1f-6d3e-457c-b78f-213f94b2b650-kube-api-access-wbjf9\") pod \"redhat-operators-47pqd\" (UID: \"24946b1f-6d3e-457c-b78f-213f94b2b650\") " pod="openshift-marketplace/redhat-operators-47pqd" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.445160 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24946b1f-6d3e-457c-b78f-213f94b2b650-utilities\") pod \"redhat-operators-47pqd\" (UID: \"24946b1f-6d3e-457c-b78f-213f94b2b650\") " pod="openshift-marketplace/redhat-operators-47pqd" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.546198 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24946b1f-6d3e-457c-b78f-213f94b2b650-catalog-content\") pod \"redhat-operators-47pqd\" (UID: \"24946b1f-6d3e-457c-b78f-213f94b2b650\") " pod="openshift-marketplace/redhat-operators-47pqd" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.546302 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbjf9\" (UniqueName: \"kubernetes.io/projected/24946b1f-6d3e-457c-b78f-213f94b2b650-kube-api-access-wbjf9\") pod \"redhat-operators-47pqd\" (UID: \"24946b1f-6d3e-457c-b78f-213f94b2b650\") " pod="openshift-marketplace/redhat-operators-47pqd" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.546327 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24946b1f-6d3e-457c-b78f-213f94b2b650-utilities\") pod \"redhat-operators-47pqd\" (UID: \"24946b1f-6d3e-457c-b78f-213f94b2b650\") " pod="openshift-marketplace/redhat-operators-47pqd" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.546680 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/24946b1f-6d3e-457c-b78f-213f94b2b650-catalog-content\") pod \"redhat-operators-47pqd\" (UID: \"24946b1f-6d3e-457c-b78f-213f94b2b650\") " pod="openshift-marketplace/redhat-operators-47pqd" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.546735 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24946b1f-6d3e-457c-b78f-213f94b2b650-utilities\") pod \"redhat-operators-47pqd\" (UID: \"24946b1f-6d3e-457c-b78f-213f94b2b650\") " pod="openshift-marketplace/redhat-operators-47pqd" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.587072 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbjf9\" (UniqueName: \"kubernetes.io/projected/24946b1f-6d3e-457c-b78f-213f94b2b650-kube-api-access-wbjf9\") pod \"redhat-operators-47pqd\" (UID: \"24946b1f-6d3e-457c-b78f-213f94b2b650\") " pod="openshift-marketplace/redhat-operators-47pqd" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.596741 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.644162 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-47pqd" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.676254 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="497b1f19-025b-4b65-b062-b4a94eec3cfc" path="/var/lib/kubelet/pods/497b1f19-025b-4b65-b062-b4a94eec3cfc/volumes" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.677044 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ee15c68-88ae-4ca8-b3d5-94266082d7ba" path="/var/lib/kubelet/pods/5ee15c68-88ae-4ca8-b3d5-94266082d7ba/volumes" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.677733 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.740577 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f948n"] Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.741999 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f948n" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.756692 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f948n"] Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.778751 4742 ???:1] "http: TLS handshake error from 192.168.126.11:37782: no serving certificate available for the kubelet" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.818823 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.823207 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-52v8r" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.852530 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f02c3898-7b15-4f51-a0ac-45a077355791-catalog-content\") pod \"redhat-operators-f948n\" (UID: \"f02c3898-7b15-4f51-a0ac-45a077355791\") " pod="openshift-marketplace/redhat-operators-f948n" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.852935 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f02c3898-7b15-4f51-a0ac-45a077355791-utilities\") pod \"redhat-operators-f948n\" (UID: \"f02c3898-7b15-4f51-a0ac-45a077355791\") " pod="openshift-marketplace/redhat-operators-f948n" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.853017 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp7v8\" (UniqueName: \"kubernetes.io/projected/f02c3898-7b15-4f51-a0ac-45a077355791-kube-api-access-wp7v8\") pod \"redhat-operators-f948n\" (UID: \"f02c3898-7b15-4f51-a0ac-45a077355791\") " pod="openshift-marketplace/redhat-operators-f948n" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.856929 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" event={"ID":"75629956-e407-4638-90cd-fd2f907bb0fb","Type":"ContainerDied","Data":"393835b451ec6f783a5019e97a9bb84d3b467277b57706160731fa596e7c0507"} Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.856957 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.856962 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="393835b451ec6f783a5019e97a9bb84d3b467277b57706160731fa596e7c0507" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.863534 4742 generic.go:334] "Generic (PLEG): container finished" podID="d416e1fd-2137-48a4-b933-b25a9ca94a8a" containerID="777778a0954a81b2fc2dc89e3f6cdef1faeae71a25103e1258b624d944e961bf" exitCode=0 Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.863578 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mvgb" event={"ID":"d416e1fd-2137-48a4-b933-b25a9ca94a8a","Type":"ContainerDied","Data":"777778a0954a81b2fc2dc89e3f6cdef1faeae71a25103e1258b624d944e961bf"} Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.863652 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mvgb" event={"ID":"d416e1fd-2137-48a4-b933-b25a9ca94a8a","Type":"ContainerStarted","Data":"782e31003da6e9c2ce12c25c7cfdb411809080f461c17ec85b30bd2d0a523261"} Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.864870 4742 generic.go:334] "Generic (PLEG): container finished" podID="ce3a51df-d6e4-46d5-95e3-8be6aaba196f" containerID="3bbcdce06bb4648cd2f4a95914e39856ee89b537e1ba9c1dd1da9b4c348c9536" exitCode=0 Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.864928 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p27vh" event={"ID":"ce3a51df-d6e4-46d5-95e3-8be6aaba196f","Type":"ContainerDied","Data":"3bbcdce06bb4648cd2f4a95914e39856ee89b537e1ba9c1dd1da9b4c348c9536"} Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.864951 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p27vh" event={"ID":"ce3a51df-d6e4-46d5-95e3-8be6aaba196f","Type":"ContainerStarted","Data":"970e14da973914f33acd8fc38cc0abd459e88f062e298899a6e3945894a002c7"} Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.869611 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e1a8e201-d6d6-49f0-885e-615ceffeae2f","Type":"ContainerStarted","Data":"8e178ddc78aa933e5668d081096c3666b75cbbc7e604efea7a31b07449552141"} Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.884107 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"067d9a25-2589-43d1-ac71-b7fbe6416a31","Type":"ContainerStarted","Data":"0cadaa258b189f4ff25ab413f92ce7c16698ec89cb46d2f8b4b36a0bcd80ba13"} Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.884138 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"067d9a25-2589-43d1-ac71-b7fbe6416a31","Type":"ContainerStarted","Data":"e5fb1531b6dba8c21de6b2efd9d6e4ef2699a9d2273ac3efd43de67e07a3bf7d"} Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.912661 4742 patch_prober.go:28] interesting pod/downloads-7954f5f757-s5z9r container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.912710 4742 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-console/downloads-7954f5f757-s5z9r" podUID="0de428d9-1755-4c28-8c6e-cbb115aef7c7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.915541 4742 patch_prober.go:28] interesting pod/downloads-7954f5f757-s5z9r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.915569 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-s5z9r" podUID="0de428d9-1755-4c28-8c6e-cbb115aef7c7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.928196 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.928385 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.954811 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f02c3898-7b15-4f51-a0ac-45a077355791-catalog-content\") pod \"redhat-operators-f948n\" (UID: \"f02c3898-7b15-4f51-a0ac-45a077355791\") " pod="openshift-marketplace/redhat-operators-f948n" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.957947 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f02c3898-7b15-4f51-a0ac-45a077355791-utilities\") pod \"redhat-operators-f948n\" (UID: \"f02c3898-7b15-4f51-a0ac-45a077355791\") " pod="openshift-marketplace/redhat-operators-f948n" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.958145 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp7v8\" (UniqueName: \"kubernetes.io/projected/f02c3898-7b15-4f51-a0ac-45a077355791-kube-api-access-wp7v8\") pod \"redhat-operators-f948n\" (UID: \"f02c3898-7b15-4f51-a0ac-45a077355791\") " pod="openshift-marketplace/redhat-operators-f948n" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.960173 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f02c3898-7b15-4f51-a0ac-45a077355791-catalog-content\") pod \"redhat-operators-f948n\" (UID: \"f02c3898-7b15-4f51-a0ac-45a077355791\") " pod="openshift-marketplace/redhat-operators-f948n" Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.961290 4742 patch_prober.go:28] interesting pod/console-f9d7485db-lfdfp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.961348 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-lfdfp" podUID="dcb66d58-3d7a-47db-b3ff-2ede326cbe34" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" 
Mar 17 11:15:54 crc kubenswrapper[4742]: I0317 11:15:54.980190 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f02c3898-7b15-4f51-a0ac-45a077355791-utilities\") pod \"redhat-operators-f948n\" (UID: \"f02c3898-7b15-4f51-a0ac-45a077355791\") " pod="openshift-marketplace/redhat-operators-f948n" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.020530 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.020505711 podStartE2EDuration="2.020505711s" podCreationTimestamp="2026-03-17 11:15:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:55.013605771 +0000 UTC m=+258.139733529" watchObservedRunningTime="2026-03-17 11:15:55.020505711 +0000 UTC m=+258.146633469" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.028125 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp7v8\" (UniqueName: \"kubernetes.io/projected/f02c3898-7b15-4f51-a0ac-45a077355791-kube-api-access-wp7v8\") pod \"redhat-operators-f948n\" (UID: \"f02c3898-7b15-4f51-a0ac-45a077355791\") " pod="openshift-marketplace/redhat-operators-f948n" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.059612 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f948n" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.144242 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.144276 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.171596 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.182552 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.185746 4742 patch_prober.go:28] interesting pod/router-default-5444994796-hwx7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 11:15:55 crc kubenswrapper[4742]: [-]has-synced failed: reason withheld Mar 17 11:15:55 crc kubenswrapper[4742]: [+]process-running ok Mar 17 11:15:55 crc kubenswrapper[4742]: healthz check failed Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.185803 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hwx7f" podUID="2800a131-02e6-49f1-9385-6065c4b4216e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.215086 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.275711 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-47pqd"] Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 
11:15:55.416507 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd"] Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.417244 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.427895 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd"] Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.427944 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.428180 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.428628 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.428743 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.428773 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.428851 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.479019 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-client-ca\") pod \"route-controller-manager-584bc7856-7vwtd\" (UID: \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\") " pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.479087 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-config\") pod \"route-controller-manager-584bc7856-7vwtd\" (UID: \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\") " pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.479119 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdzrd\" (UniqueName: \"kubernetes.io/projected/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-kube-api-access-kdzrd\") pod \"route-controller-manager-584bc7856-7vwtd\" (UID: \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\") " pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.479209 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-serving-cert\") pod \"route-controller-manager-584bc7856-7vwtd\" (UID: \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\") " pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" Mar 17 11:15:55 crc 
kubenswrapper[4742]: I0317 11:15:55.581116 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-serving-cert\") pod \"route-controller-manager-584bc7856-7vwtd\" (UID: \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\") " pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.581209 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-client-ca\") pod \"route-controller-manager-584bc7856-7vwtd\" (UID: \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\") " pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.581238 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-config\") pod \"route-controller-manager-584bc7856-7vwtd\" (UID: \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\") " pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.581254 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdzrd\" (UniqueName: \"kubernetes.io/projected/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-kube-api-access-kdzrd\") pod \"route-controller-manager-584bc7856-7vwtd\" (UID: \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\") " pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.585145 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-client-ca\") pod \"route-controller-manager-584bc7856-7vwtd\" (UID: \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\") " pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.587428 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-config\") pod \"route-controller-manager-584bc7856-7vwtd\" (UID: \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\") " pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.604895 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-serving-cert\") pod \"route-controller-manager-584bc7856-7vwtd\" (UID: \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\") " pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.606830 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f948n"] Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.608958 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdzrd\" (UniqueName: \"kubernetes.io/projected/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-kube-api-access-kdzrd\") pod \"route-controller-manager-584bc7856-7vwtd\" (UID: \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\") " 
pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.741014 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.915222 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f948n" event={"ID":"f02c3898-7b15-4f51-a0ac-45a077355791","Type":"ContainerStarted","Data":"5eab7d7ed9f56d14ec449087fda6347e692d36ff14a0c81de1a7cf14f289f796"} Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.915331 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f948n" event={"ID":"f02c3898-7b15-4f51-a0ac-45a077355791","Type":"ContainerStarted","Data":"1bf6dbad41f7201f3167ca65e63d5ef269bfbf4cb39e3f47c6a2156f060da085"} Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.917629 4742 generic.go:334] "Generic (PLEG): container finished" podID="24946b1f-6d3e-457c-b78f-213f94b2b650" containerID="bd145430c6009412a8ca4df21318b658485841e313380d1acbef3f19805746b3" exitCode=0 Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.917689 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47pqd" event={"ID":"24946b1f-6d3e-457c-b78f-213f94b2b650","Type":"ContainerDied","Data":"bd145430c6009412a8ca4df21318b658485841e313380d1acbef3f19805746b3"} Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.917761 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47pqd" event={"ID":"24946b1f-6d3e-457c-b78f-213f94b2b650","Type":"ContainerStarted","Data":"21f2d5a23f955d199378a6a91571fa9740e60d4046211aed7df111b8fc86ee3a"} Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.921607 4742 generic.go:334] "Generic (PLEG): container finished" podID="e1a8e201-d6d6-49f0-885e-615ceffeae2f" containerID="96d3f797382e55d1d494e4874f67d144e0cf9d654ea84b7e4721f9e3d3163f66" exitCode=0 Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.921660 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e1a8e201-d6d6-49f0-885e-615ceffeae2f","Type":"ContainerDied","Data":"96d3f797382e55d1d494e4874f67d144e0cf9d654ea84b7e4721f9e3d3163f66"} Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.924493 4742 generic.go:334] "Generic (PLEG): container finished" podID="067d9a25-2589-43d1-ac71-b7fbe6416a31" containerID="0cadaa258b189f4ff25ab413f92ce7c16698ec89cb46d2f8b4b36a0bcd80ba13" exitCode=0 Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.924547 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"067d9a25-2589-43d1-ac71-b7fbe6416a31","Type":"ContainerDied","Data":"0cadaa258b189f4ff25ab413f92ce7c16698ec89cb46d2f8b4b36a0bcd80ba13"} Mar 17 11:15:55 crc kubenswrapper[4742]: I0317 11:15:55.929341 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hmmrg" Mar 17 11:15:56 crc kubenswrapper[4742]: I0317 11:15:56.189191 4742 patch_prober.go:28] interesting pod/router-default-5444994796-hwx7f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 17 11:15:56 crc kubenswrapper[4742]: 
[-]has-synced failed: reason withheld Mar 17 11:15:56 crc kubenswrapper[4742]: [+]process-running ok Mar 17 11:15:56 crc kubenswrapper[4742]: healthz check failed Mar 17 11:15:56 crc kubenswrapper[4742]: I0317 11:15:56.190384 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hwx7f" podUID="2800a131-02e6-49f1-9385-6065c4b4216e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 17 11:15:56 crc kubenswrapper[4742]: I0317 11:15:56.393138 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd"] Mar 17 11:15:56 crc kubenswrapper[4742]: I0317 11:15:56.975520 4742 generic.go:334] "Generic (PLEG): container finished" podID="f02c3898-7b15-4f51-a0ac-45a077355791" containerID="5eab7d7ed9f56d14ec449087fda6347e692d36ff14a0c81de1a7cf14f289f796" exitCode=0 Mar 17 11:15:56 crc kubenswrapper[4742]: I0317 11:15:56.975684 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f948n" event={"ID":"f02c3898-7b15-4f51-a0ac-45a077355791","Type":"ContainerDied","Data":"5eab7d7ed9f56d14ec449087fda6347e692d36ff14a0c81de1a7cf14f289f796"} Mar 17 11:15:56 crc kubenswrapper[4742]: I0317 11:15:56.982921 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" event={"ID":"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7","Type":"ContainerStarted","Data":"ff093dfc474deeaef8ccb39f056af7c881830082a5617979af5d165fc695fa05"} Mar 17 11:15:56 crc kubenswrapper[4742]: I0317 11:15:56.982955 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" event={"ID":"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7","Type":"ContainerStarted","Data":"b4f8e48bc01e31213bfc94bfc8bc6814a4ca817caa4b8e3fe5799cefe55b5c77"} Mar 17 11:15:56 crc kubenswrapper[4742]: I0317 11:15:56.983651 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" Mar 17 11:15:57 crc kubenswrapper[4742]: I0317 11:15:57.006208 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" podStartSLOduration=7.006189845 podStartE2EDuration="7.006189845s" podCreationTimestamp="2026-03-17 11:15:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:15:57.004018221 +0000 UTC m=+260.130145979" watchObservedRunningTime="2026-03-17 11:15:57.006189845 +0000 UTC m=+260.132317593" Mar 17 11:15:57 crc kubenswrapper[4742]: I0317 11:15:57.194577 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:57 crc kubenswrapper[4742]: I0317 11:15:57.196928 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-hwx7f" Mar 17 11:15:57 crc kubenswrapper[4742]: I0317 11:15:57.211162 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd"
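
The startup probe failure above is the router reporting unhealthy while its backends are still syncing: the healthz output marks [-]backend-http and [-]has-synced as failed with [+]process-running ok, and one probe period later (11:15:57) the same pod reports startup "started" and readiness "ready", so the failure was transient. If it persisted, the same check can be rerun by hand as sketched below; the probe port and path are deliberately read from the pod spec rather than assumed (PORT and PATH are placeholders, and curl must exist in the router image):

  # Print the startup probe definition for the router container:
  $ oc -n openshift-ingress get pod router-default-5444994796-hwx7f \
      -o jsonpath='{.spec.containers[?(@.name=="router")].startupProbe}'
  # Re-run the probe's HTTP check in-container, substituting the port and
  # path printed above (values not confirmed by this log):
  $ oc -n openshift-ingress exec router-default-5444994796-hwx7f -c router -- \
      curl -s -o /dev/null -w '%{http_code}\n' http://localhost:PORT/PATH

Mar 17 11:15:58 crc kubenswrapper[4742]: I0317 11:15:58.883628 4742 ???:1] "http: TLS handshake error from 192.168.126.11:43588: no serving certificate available for the 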
kubelet" Mar 17 11:15:59 crc kubenswrapper[4742]: I0317 11:15:59.922807 4742 ???:1] "http: TLS handshake error from 192.168.126.11:43602: no serving certificate available for the kubelet" Mar 17 11:16:00 crc kubenswrapper[4742]: I0317 11:16:00.133930 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562436-cnnrt"] Mar 17 11:16:00 crc kubenswrapper[4742]: I0317 11:16:00.134566 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562436-cnnrt" Mar 17 11:16:00 crc kubenswrapper[4742]: I0317 11:16:00.141257 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:16:00 crc kubenswrapper[4742]: I0317 11:16:00.142610 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562436-cnnrt"] Mar 17 11:16:00 crc kubenswrapper[4742]: I0317 11:16:00.291550 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bql4m\" (UniqueName: \"kubernetes.io/projected/f99ba73f-1688-43ea-9538-bc7623c02521-kube-api-access-bql4m\") pod \"auto-csr-approver-29562436-cnnrt\" (UID: \"f99ba73f-1688-43ea-9538-bc7623c02521\") " pod="openshift-infra/auto-csr-approver-29562436-cnnrt" Mar 17 11:16:00 crc kubenswrapper[4742]: I0317 11:16:00.395059 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bql4m\" (UniqueName: \"kubernetes.io/projected/f99ba73f-1688-43ea-9538-bc7623c02521-kube-api-access-bql4m\") pod \"auto-csr-approver-29562436-cnnrt\" (UID: \"f99ba73f-1688-43ea-9538-bc7623c02521\") " pod="openshift-infra/auto-csr-approver-29562436-cnnrt" Mar 17 11:16:00 crc kubenswrapper[4742]: I0317 11:16:00.412763 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bql4m\" (UniqueName: \"kubernetes.io/projected/f99ba73f-1688-43ea-9538-bc7623c02521-kube-api-access-bql4m\") pod \"auto-csr-approver-29562436-cnnrt\" (UID: \"f99ba73f-1688-43ea-9538-bc7623c02521\") " pod="openshift-infra/auto-csr-approver-29562436-cnnrt" Mar 17 11:16:00 crc kubenswrapper[4742]: I0317 11:16:00.467086 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562436-cnnrt" Mar 17 11:16:00 crc kubenswrapper[4742]: I0317 11:16:00.973997 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-58scf" Mar 17 11:16:04 crc kubenswrapper[4742]: I0317 11:16:04.564374 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 11:16:04 crc kubenswrapper[4742]: I0317 11:16:04.569020 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 11:16:04 crc kubenswrapper[4742]: I0317 11:16:04.667252 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/067d9a25-2589-43d1-ac71-b7fbe6416a31-kube-api-access\") pod \"067d9a25-2589-43d1-ac71-b7fbe6416a31\" (UID: \"067d9a25-2589-43d1-ac71-b7fbe6416a31\") " Mar 17 11:16:04 crc kubenswrapper[4742]: I0317 11:16:04.667342 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1a8e201-d6d6-49f0-885e-615ceffeae2f-kubelet-dir\") pod \"e1a8e201-d6d6-49f0-885e-615ceffeae2f\" (UID: \"e1a8e201-d6d6-49f0-885e-615ceffeae2f\") " Mar 17 11:16:04 crc kubenswrapper[4742]: I0317 11:16:04.667390 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/067d9a25-2589-43d1-ac71-b7fbe6416a31-kubelet-dir\") pod \"067d9a25-2589-43d1-ac71-b7fbe6416a31\" (UID: \"067d9a25-2589-43d1-ac71-b7fbe6416a31\") " Mar 17 11:16:04 crc kubenswrapper[4742]: I0317 11:16:04.667409 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1a8e201-d6d6-49f0-885e-615ceffeae2f-kube-api-access\") pod \"e1a8e201-d6d6-49f0-885e-615ceffeae2f\" (UID: \"e1a8e201-d6d6-49f0-885e-615ceffeae2f\") " Mar 17 11:16:04 crc kubenswrapper[4742]: I0317 11:16:04.667483 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1a8e201-d6d6-49f0-885e-615ceffeae2f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e1a8e201-d6d6-49f0-885e-615ceffeae2f" (UID: "e1a8e201-d6d6-49f0-885e-615ceffeae2f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:16:04 crc kubenswrapper[4742]: I0317 11:16:04.667511 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/067d9a25-2589-43d1-ac71-b7fbe6416a31-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "067d9a25-2589-43d1-ac71-b7fbe6416a31" (UID: "067d9a25-2589-43d1-ac71-b7fbe6416a31"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:16:04 crc kubenswrapper[4742]: I0317 11:16:04.667799 4742 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1a8e201-d6d6-49f0-885e-615ceffeae2f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:04 crc kubenswrapper[4742]: I0317 11:16:04.667813 4742 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/067d9a25-2589-43d1-ac71-b7fbe6416a31-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:04 crc kubenswrapper[4742]: I0317 11:16:04.672789 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a8e201-d6d6-49f0-885e-615ceffeae2f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e1a8e201-d6d6-49f0-885e-615ceffeae2f" (UID: "e1a8e201-d6d6-49f0-885e-615ceffeae2f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:16:04 crc kubenswrapper[4742]: I0317 11:16:04.676634 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/067d9a25-2589-43d1-ac71-b7fbe6416a31-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "067d9a25-2589-43d1-ac71-b7fbe6416a31" (UID: "067d9a25-2589-43d1-ac71-b7fbe6416a31"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:16:04 crc kubenswrapper[4742]: I0317 11:16:04.777024 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1a8e201-d6d6-49f0-885e-615ceffeae2f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:04 crc kubenswrapper[4742]: I0317 11:16:04.777053 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/067d9a25-2589-43d1-ac71-b7fbe6416a31-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:04 crc kubenswrapper[4742]: I0317 11:16:04.931215 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-s5z9r" Mar 17 11:16:04 crc kubenswrapper[4742]: I0317 11:16:04.933830 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:16:04 crc kubenswrapper[4742]: I0317 11:16:04.940854 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:16:05 crc kubenswrapper[4742]: I0317 11:16:05.066873 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e1a8e201-d6d6-49f0-885e-615ceffeae2f","Type":"ContainerDied","Data":"8e178ddc78aa933e5668d081096c3666b75cbbc7e604efea7a31b07449552141"} Mar 17 11:16:05 crc kubenswrapper[4742]: I0317 11:16:05.066934 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e178ddc78aa933e5668d081096c3666b75cbbc7e604efea7a31b07449552141" Mar 17 11:16:05 crc kubenswrapper[4742]: I0317 11:16:05.067039 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 17 11:16:05 crc kubenswrapper[4742]: I0317 11:16:05.068704 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 17 11:16:05 crc kubenswrapper[4742]: I0317 11:16:05.069039 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"067d9a25-2589-43d1-ac71-b7fbe6416a31","Type":"ContainerDied","Data":"e5fb1531b6dba8c21de6b2efd9d6e4ef2699a9d2273ac3efd43de67e07a3bf7d"} Mar 17 11:16:05 crc kubenswrapper[4742]: I0317 11:16:05.069079 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5fb1531b6dba8c21de6b2efd9d6e4ef2699a9d2273ac3efd43de67e07a3bf7d" Mar 17 11:16:06 crc kubenswrapper[4742]: E0317 11:16:06.825163 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 17 11:16:06 crc kubenswrapper[4742]: E0317 11:16:06.825326 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:16:06 crc kubenswrapper[4742]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 17 11:16:06 crc kubenswrapper[4742]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-86f6s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29562434-wtx87_openshift-infra(5b3a8612-a5db-4ec8-9873-32829e2fe69e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 17 11:16:06 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:16:06 crc kubenswrapper[4742]: E0317 11:16:06.826508 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29562434-wtx87" podUID="5b3a8612-a5db-4ec8-9873-32829e2fe69e" Mar 17 11:16:07 crc kubenswrapper[4742]: E0317 11:16:07.084270 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29562434-wtx87" podUID="5b3a8612-a5db-4ec8-9873-32829e2fe69e" Mar 17 11:16:09 crc kubenswrapper[4742]: I0317 11:16:09.031766 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b7568f775-wqc5s"] Mar 17 11:16:09 crc kubenswrapper[4742]: I0317 11:16:09.032477 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s" 
podUID="af781729-fe84-4dcf-afed-cb56769bf4ca" containerName="controller-manager" containerID="cri-o://68e472ab1a0d17f93ce017abb5eef75c7bb0ec16947bc450b6597f73972f53ef" gracePeriod=30 Mar 17 11:16:09 crc kubenswrapper[4742]: I0317 11:16:09.055251 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd"] Mar 17 11:16:09 crc kubenswrapper[4742]: I0317 11:16:09.055426 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" podUID="e4a20910-9c0a-48de-a5f5-9b18dad6e1d7" containerName="route-controller-manager" containerID="cri-o://ff093dfc474deeaef8ccb39f056af7c881830082a5617979af5d165fc695fa05" gracePeriod=30 Mar 17 11:16:09 crc kubenswrapper[4742]: I0317 11:16:09.971052 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s" Mar 17 11:16:09 crc kubenswrapper[4742]: I0317 11:16:09.975592 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.093676 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-config\") pod \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\" (UID: \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\") " Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.093746 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-client-ca\") pod \"af781729-fe84-4dcf-afed-cb56769bf4ca\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.093784 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af781729-fe84-4dcf-afed-cb56769bf4ca-serving-cert\") pod \"af781729-fe84-4dcf-afed-cb56769bf4ca\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.093840 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-config\") pod \"af781729-fe84-4dcf-afed-cb56769bf4ca\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.093868 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqvtk\" (UniqueName: \"kubernetes.io/projected/af781729-fe84-4dcf-afed-cb56769bf4ca-kube-api-access-pqvtk\") pod \"af781729-fe84-4dcf-afed-cb56769bf4ca\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.093894 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdzrd\" (UniqueName: \"kubernetes.io/projected/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-kube-api-access-kdzrd\") pod \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\" (UID: \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\") " Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.093968 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-client-ca\") pod \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\" (UID: \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\") " Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.093993 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-proxy-ca-bundles\") pod \"af781729-fe84-4dcf-afed-cb56769bf4ca\" (UID: \"af781729-fe84-4dcf-afed-cb56769bf4ca\") " Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.094048 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-serving-cert\") pod \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\" (UID: \"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7\") " Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.096334 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-client-ca" (OuterVolumeSpecName: "client-ca") pod "e4a20910-9c0a-48de-a5f5-9b18dad6e1d7" (UID: "e4a20910-9c0a-48de-a5f5-9b18dad6e1d7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.096397 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-config" (OuterVolumeSpecName: "config") pod "e4a20910-9c0a-48de-a5f5-9b18dad6e1d7" (UID: "e4a20910-9c0a-48de-a5f5-9b18dad6e1d7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.096714 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-client-ca" (OuterVolumeSpecName: "client-ca") pod "af781729-fe84-4dcf-afed-cb56769bf4ca" (UID: "af781729-fe84-4dcf-afed-cb56769bf4ca"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.096752 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "af781729-fe84-4dcf-afed-cb56769bf4ca" (UID: "af781729-fe84-4dcf-afed-cb56769bf4ca"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.096837 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-config" (OuterVolumeSpecName: "config") pod "af781729-fe84-4dcf-afed-cb56769bf4ca" (UID: "af781729-fe84-4dcf-afed-cb56769bf4ca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.100206 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af781729-fe84-4dcf-afed-cb56769bf4ca-kube-api-access-pqvtk" (OuterVolumeSpecName: "kube-api-access-pqvtk") pod "af781729-fe84-4dcf-afed-cb56769bf4ca" (UID: "af781729-fe84-4dcf-afed-cb56769bf4ca"). InnerVolumeSpecName "kube-api-access-pqvtk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.101077 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af781729-fe84-4dcf-afed-cb56769bf4ca-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "af781729-fe84-4dcf-afed-cb56769bf4ca" (UID: "af781729-fe84-4dcf-afed-cb56769bf4ca"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.104762 4742 generic.go:334] "Generic (PLEG): container finished" podID="af781729-fe84-4dcf-afed-cb56769bf4ca" containerID="68e472ab1a0d17f93ce017abb5eef75c7bb0ec16947bc450b6597f73972f53ef" exitCode=0 Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.104963 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s" event={"ID":"af781729-fe84-4dcf-afed-cb56769bf4ca","Type":"ContainerDied","Data":"68e472ab1a0d17f93ce017abb5eef75c7bb0ec16947bc450b6597f73972f53ef"} Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.105024 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s" event={"ID":"af781729-fe84-4dcf-afed-cb56769bf4ca","Type":"ContainerDied","Data":"d525baf062858770152386a085e86efda2240d690c2c7748b457a439e06b8c45"} Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.105047 4742 scope.go:117] "RemoveContainer" containerID="68e472ab1a0d17f93ce017abb5eef75c7bb0ec16947bc450b6597f73972f53ef" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.105239 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b7568f775-wqc5s" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.106574 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e4a20910-9c0a-48de-a5f5-9b18dad6e1d7" (UID: "e4a20910-9c0a-48de-a5f5-9b18dad6e1d7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.108465 4742 generic.go:334] "Generic (PLEG): container finished" podID="e4a20910-9c0a-48de-a5f5-9b18dad6e1d7" containerID="ff093dfc474deeaef8ccb39f056af7c881830082a5617979af5d165fc695fa05" exitCode=0 Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.108505 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" event={"ID":"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7","Type":"ContainerDied","Data":"ff093dfc474deeaef8ccb39f056af7c881830082a5617979af5d165fc695fa05"} Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.108540 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" event={"ID":"e4a20910-9c0a-48de-a5f5-9b18dad6e1d7","Type":"ContainerDied","Data":"b4f8e48bc01e31213bfc94bfc8bc6814a4ca817caa4b8e3fe5799cefe55b5c77"} Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.108599 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.112633 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-kube-api-access-kdzrd" (OuterVolumeSpecName: "kube-api-access-kdzrd") pod "e4a20910-9c0a-48de-a5f5-9b18dad6e1d7" (UID: "e4a20910-9c0a-48de-a5f5-9b18dad6e1d7"). InnerVolumeSpecName "kube-api-access-kdzrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.143256 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b7568f775-wqc5s"] Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.148821 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b7568f775-wqc5s"] Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.196025 4742 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.196060 4742 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.196072 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.196083 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.196095 4742 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.196103 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af781729-fe84-4dcf-afed-cb56769bf4ca-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.196111 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af781729-fe84-4dcf-afed-cb56769bf4ca-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.196119 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqvtk\" (UniqueName: \"kubernetes.io/projected/af781729-fe84-4dcf-afed-cb56769bf4ca-kube-api-access-pqvtk\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.196127 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdzrd\" (UniqueName: \"kubernetes.io/projected/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7-kube-api-access-kdzrd\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.425419 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2"] Mar 17 11:16:10 crc 
kubenswrapper[4742]: E0317 11:16:10.429891 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a20910-9c0a-48de-a5f5-9b18dad6e1d7" containerName="route-controller-manager" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.429930 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a20910-9c0a-48de-a5f5-9b18dad6e1d7" containerName="route-controller-manager" Mar 17 11:16:10 crc kubenswrapper[4742]: E0317 11:16:10.429940 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a8e201-d6d6-49f0-885e-615ceffeae2f" containerName="pruner" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.429946 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a8e201-d6d6-49f0-885e-615ceffeae2f" containerName="pruner" Mar 17 11:16:10 crc kubenswrapper[4742]: E0317 11:16:10.429959 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af781729-fe84-4dcf-afed-cb56769bf4ca" containerName="controller-manager" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.429966 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="af781729-fe84-4dcf-afed-cb56769bf4ca" containerName="controller-manager" Mar 17 11:16:10 crc kubenswrapper[4742]: E0317 11:16:10.429981 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="067d9a25-2589-43d1-ac71-b7fbe6416a31" containerName="pruner" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.429992 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="067d9a25-2589-43d1-ac71-b7fbe6416a31" containerName="pruner" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.430491 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a8e201-d6d6-49f0-885e-615ceffeae2f" containerName="pruner" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.430504 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a20910-9c0a-48de-a5f5-9b18dad6e1d7" containerName="route-controller-manager" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.430514 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="067d9a25-2589-43d1-ac71-b7fbe6416a31" containerName="pruner" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.430522 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="af781729-fe84-4dcf-afed-cb56769bf4ca" containerName="controller-manager" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.431148 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.435853 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm"] Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.436548 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.436689 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.436885 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.436914 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.437404 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.437947 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.441297 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.441814 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.441978 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.442115 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.442277 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.443237 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.444583 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.444709 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2"] Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.445590 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.446993 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm"] Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.513699 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwswd\" (UniqueName: \"kubernetes.io/projected/a8cff413-5a40-4a16-a24c-3c0f82377789-kube-api-access-xwswd\") pod \"controller-manager-54d6b5c6f8-gr8l2\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") " pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.513766 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8cff413-5a40-4a16-a24c-3c0f82377789-serving-cert\") pod \"controller-manager-54d6b5c6f8-gr8l2\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") " pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.513813 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-proxy-ca-bundles\") pod \"controller-manager-54d6b5c6f8-gr8l2\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") " pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.513966 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/777c45b3-3911-46ac-a314-db38bbed187a-client-ca\") pod \"route-controller-manager-dcbcd868-d8stm\" (UID: \"777c45b3-3911-46ac-a314-db38bbed187a\") " pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.514000 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-client-ca\") pod \"controller-manager-54d6b5c6f8-gr8l2\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") " pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.514017 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/777c45b3-3911-46ac-a314-db38bbed187a-serving-cert\") pod \"route-controller-manager-dcbcd868-d8stm\" (UID: \"777c45b3-3911-46ac-a314-db38bbed187a\") " pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.514174 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-config\") pod \"controller-manager-54d6b5c6f8-gr8l2\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") " pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.514200 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn422\" (UniqueName: \"kubernetes.io/projected/777c45b3-3911-46ac-a314-db38bbed187a-kube-api-access-pn422\") pod \"route-controller-manager-dcbcd868-d8stm\" (UID: \"777c45b3-3911-46ac-a314-db38bbed187a\") " pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.514236 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/777c45b3-3911-46ac-a314-db38bbed187a-config\") pod \"route-controller-manager-dcbcd868-d8stm\" (UID: \"777c45b3-3911-46ac-a314-db38bbed187a\") " pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.515325 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd"] Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.519344 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-584bc7856-7vwtd"] Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.615271 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-client-ca\") 
pod \"controller-manager-54d6b5c6f8-gr8l2\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") " pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.615310 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/777c45b3-3911-46ac-a314-db38bbed187a-serving-cert\") pod \"route-controller-manager-dcbcd868-d8stm\" (UID: \"777c45b3-3911-46ac-a314-db38bbed187a\") " pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.615352 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-config\") pod \"controller-manager-54d6b5c6f8-gr8l2\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") " pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.615371 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn422\" (UniqueName: \"kubernetes.io/projected/777c45b3-3911-46ac-a314-db38bbed187a-kube-api-access-pn422\") pod \"route-controller-manager-dcbcd868-d8stm\" (UID: \"777c45b3-3911-46ac-a314-db38bbed187a\") " pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.616354 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-client-ca\") pod \"controller-manager-54d6b5c6f8-gr8l2\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") " pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.616712 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-config\") pod \"controller-manager-54d6b5c6f8-gr8l2\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") " pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.616860 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/777c45b3-3911-46ac-a314-db38bbed187a-config\") pod \"route-controller-manager-dcbcd868-d8stm\" (UID: \"777c45b3-3911-46ac-a314-db38bbed187a\") " pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.616884 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwswd\" (UniqueName: \"kubernetes.io/projected/a8cff413-5a40-4a16-a24c-3c0f82377789-kube-api-access-xwswd\") pod \"controller-manager-54d6b5c6f8-gr8l2\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") " pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.616904 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8cff413-5a40-4a16-a24c-3c0f82377789-serving-cert\") pod \"controller-manager-54d6b5c6f8-gr8l2\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") " pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" Mar 17 11:16:10 crc 
kubenswrapper[4742]: I0317 11:16:10.616976 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-proxy-ca-bundles\") pod \"controller-manager-54d6b5c6f8-gr8l2\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") " pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.617025 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/777c45b3-3911-46ac-a314-db38bbed187a-client-ca\") pod \"route-controller-manager-dcbcd868-d8stm\" (UID: \"777c45b3-3911-46ac-a314-db38bbed187a\") " pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.619319 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/777c45b3-3911-46ac-a314-db38bbed187a-config\") pod \"route-controller-manager-dcbcd868-d8stm\" (UID: \"777c45b3-3911-46ac-a314-db38bbed187a\") " pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.619668 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/777c45b3-3911-46ac-a314-db38bbed187a-serving-cert\") pod \"route-controller-manager-dcbcd868-d8stm\" (UID: \"777c45b3-3911-46ac-a314-db38bbed187a\") " pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.619712 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-proxy-ca-bundles\") pod \"controller-manager-54d6b5c6f8-gr8l2\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") " pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.623811 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/777c45b3-3911-46ac-a314-db38bbed187a-client-ca\") pod \"route-controller-manager-dcbcd868-d8stm\" (UID: \"777c45b3-3911-46ac-a314-db38bbed187a\") " pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.634557 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn422\" (UniqueName: \"kubernetes.io/projected/777c45b3-3911-46ac-a314-db38bbed187a-kube-api-access-pn422\") pod \"route-controller-manager-dcbcd868-d8stm\" (UID: \"777c45b3-3911-46ac-a314-db38bbed187a\") " pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.635139 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8cff413-5a40-4a16-a24c-3c0f82377789-serving-cert\") pod \"controller-manager-54d6b5c6f8-gr8l2\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") " pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.640474 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwswd\" (UniqueName: 
\"kubernetes.io/projected/a8cff413-5a40-4a16-a24c-3c0f82377789-kube-api-access-xwswd\") pod \"controller-manager-54d6b5c6f8-gr8l2\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") " pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.673459 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af781729-fe84-4dcf-afed-cb56769bf4ca" path="/var/lib/kubelet/pods/af781729-fe84-4dcf-afed-cb56769bf4ca/volumes" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.674065 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a20910-9c0a-48de-a5f5-9b18dad6e1d7" path="/var/lib/kubelet/pods/e4a20910-9c0a-48de-a5f5-9b18dad6e1d7/volumes" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.805742 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" Mar 17 11:16:10 crc kubenswrapper[4742]: I0317 11:16:10.813204 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" Mar 17 11:16:12 crc kubenswrapper[4742]: I0317 11:16:12.675860 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" Mar 17 11:16:13 crc kubenswrapper[4742]: I0317 11:16:13.233528 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562436-cnnrt"] Mar 17 11:16:18 crc kubenswrapper[4742]: I0317 11:16:18.044635 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:16:18 crc kubenswrapper[4742]: I0317 11:16:18.044983 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:16:19 crc kubenswrapper[4742]: E0317 11:16:19.467274 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 17 11:16:19 crc kubenswrapper[4742]: E0317 11:16:19.467469 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w5sxs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5v4hw_openshift-marketplace(72e6f877-4431-46ba-8c22-0479a383851b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 17 11:16:19 crc kubenswrapper[4742]: E0317 11:16:19.468780 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5v4hw" podUID="72e6f877-4431-46ba-8c22-0479a383851b"
Mar 17 11:16:20 crc kubenswrapper[4742]: I0317 11:16:20.431281 4742 ???:1] "http: TLS handshake error from 192.168.126.11:36870: no serving certificate available for the kubelet"
Mar 17 11:16:23 crc kubenswrapper[4742]: E0317 11:16:23.034438 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Mar 17 11:16:23 crc kubenswrapper[4742]: E0317 11:16:23.034923 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s9xsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6mvgb_openshift-marketplace(d416e1fd-2137-48a4-b933-b25a9ca94a8a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 17 11:16:23 crc kubenswrapper[4742]: E0317 11:16:23.036144 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6mvgb" podUID="d416e1fd-2137-48a4-b933-b25a9ca94a8a"
Mar 17 11:16:23 crc kubenswrapper[4742]: E0317 11:16:23.077506 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Mar 17 11:16:23 crc kubenswrapper[4742]: E0317 11:16:23.077687 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tkglr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-p27vh_openshift-marketplace(ce3a51df-d6e4-46d5-95e3-8be6aaba196f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 17 11:16:23 crc kubenswrapper[4742]: E0317 11:16:23.078965 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-p27vh" podUID="ce3a51df-d6e4-46d5-95e3-8be6aaba196f"
Mar 17 11:16:24 crc kubenswrapper[4742]: I0317 11:16:24.500652 4742 scope.go:117] "RemoveContainer" containerID="68e472ab1a0d17f93ce017abb5eef75c7bb0ec16947bc450b6597f73972f53ef"
Mar 17 11:16:24 crc kubenswrapper[4742]: E0317 11:16:24.501349 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68e472ab1a0d17f93ce017abb5eef75c7bb0ec16947bc450b6597f73972f53ef\": container with ID starting with 68e472ab1a0d17f93ce017abb5eef75c7bb0ec16947bc450b6597f73972f53ef not found: ID does not exist" containerID="68e472ab1a0d17f93ce017abb5eef75c7bb0ec16947bc450b6597f73972f53ef"
Mar 17 11:16:24 crc kubenswrapper[4742]: I0317 11:16:24.501401 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68e472ab1a0d17f93ce017abb5eef75c7bb0ec16947bc450b6597f73972f53ef"} err="failed to get container status \"68e472ab1a0d17f93ce017abb5eef75c7bb0ec16947bc450b6597f73972f53ef\": rpc error: code = NotFound desc = could not find container \"68e472ab1a0d17f93ce017abb5eef75c7bb0ec16947bc450b6597f73972f53ef\": container with ID starting with 68e472ab1a0d17f93ce017abb5eef75c7bb0ec16947bc450b6597f73972f53ef not found: ID does not exist"
Mar 17 11:16:24 crc kubenswrapper[4742]: I0317 11:16:24.501440 4742 scope.go:117] "RemoveContainer" containerID="ff093dfc474deeaef8ccb39f056af7c881830082a5617979af5d165fc695fa05"
Mar 17 11:16:24 crc kubenswrapper[4742]: E0317 11:16:24.513492 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-p27vh" podUID="ce3a51df-d6e4-46d5-95e3-8be6aaba196f"
Mar 17 11:16:24 crc kubenswrapper[4742]: E0317 11:16:24.513550 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6mvgb" podUID="d416e1fd-2137-48a4-b933-b25a9ca94a8a"
Mar 17 11:16:24 crc kubenswrapper[4742]: E0317 11:16:24.513583 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5v4hw" podUID="72e6f877-4431-46ba-8c22-0479a383851b"
Mar 17 11:16:24 crc kubenswrapper[4742]: W0317 11:16:24.513653 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf99ba73f_1688_43ea_9538_bc7623c02521.slice/crio-d25bf9c191dd996e77c75db24ce845e13e57af907b3295ac7cd4cc655dec79c6 WatchSource:0}: Error finding container d25bf9c191dd996e77c75db24ce845e13e57af907b3295ac7cd4cc655dec79c6: Status 404 returned error can't find the container with id d25bf9c191dd996e77c75db24ce845e13e57af907b3295ac7cd4cc655dec79c6
Mar 17 11:16:24 crc kubenswrapper[4742]: E0317 11:16:24.556359 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 17 11:16:24 crc kubenswrapper[4742]: E0317 11:16:24.556766 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7w5nm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-cnzgk_openshift-marketplace(4b80c435-0e24-4ab2-980c-f2dfb1baef87): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 17 11:16:24 crc kubenswrapper[4742]: E0317 11:16:24.557927 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-cnzgk" podUID="4b80c435-0e24-4ab2-980c-f2dfb1baef87"
Mar 17 11:16:24 crc kubenswrapper[4742]: E0317 11:16:24.576303 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 17 11:16:24 crc kubenswrapper[4742]: E0317 11:16:24.576454 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p7nqr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-x6ffv_openshift-marketplace(e0543787-88e8-463d-b01b-694ecb854bfa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 17 11:16:24 crc kubenswrapper[4742]: E0317 11:16:24.578199 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-x6ffv" podUID="e0543787-88e8-463d-b01b-694ecb854bfa"
Mar 17 11:16:24 crc kubenswrapper[4742]: I0317 11:16:24.592192 4742 scope.go:117] "RemoveContainer" containerID="ff093dfc474deeaef8ccb39f056af7c881830082a5617979af5d165fc695fa05"
Mar 17 11:16:24 crc kubenswrapper[4742]: E0317 11:16:24.592864 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff093dfc474deeaef8ccb39f056af7c881830082a5617979af5d165fc695fa05\": container with ID starting with ff093dfc474deeaef8ccb39f056af7c881830082a5617979af5d165fc695fa05 not found: ID does not exist" containerID="ff093dfc474deeaef8ccb39f056af7c881830082a5617979af5d165fc695fa05"
Mar 17 11:16:24 crc kubenswrapper[4742]: I0317 11:16:24.592923 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff093dfc474deeaef8ccb39f056af7c881830082a5617979af5d165fc695fa05"} err="failed to get container status \"ff093dfc474deeaef8ccb39f056af7c881830082a5617979af5d165fc695fa05\": rpc error: code = NotFound desc = could not find container \"ff093dfc474deeaef8ccb39f056af7c881830082a5617979af5d165fc695fa05\": container with ID starting with ff093dfc474deeaef8ccb39f056af7c881830082a5617979af5d165fc695fa05 not found: ID does not exist"
Mar 17 11:16:24 crc kubenswrapper[4742]: E0317 11:16:24.635815 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 17 11:16:24 crc kubenswrapper[4742]: E0317 11:16:24.636050 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9fmck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qzdpj_openshift-marketplace(c8e4be20-e918-45d7-b026-12ef2abf3462): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 17 11:16:24 crc kubenswrapper[4742]: E0317 11:16:24.637268 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-qzdpj" podUID="c8e4be20-e918-45d7-b026-12ef2abf3462"
Mar 17 11:16:24 crc kubenswrapper[4742]: I0317 11:16:24.769725 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm"]
Mar 17 11:16:24 crc kubenswrapper[4742]: I0317 11:16:24.823106 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2"]
Mar 17 11:16:25 crc kubenswrapper[4742]: W0317 11:16:25.000065 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8cff413_5a40_4a16_a24c_3c0f82377789.slice/crio-b965bc22e86b79dbf4e5ab99419b0bdeef15abedcae973f86171c3c2e15e0b6a WatchSource:0}: Error finding container b965bc22e86b79dbf4e5ab99419b0bdeef15abedcae973f86171c3c2e15e0b6a: Status 404 returned error can't find the container with id b965bc22e86b79dbf4e5ab99419b0bdeef15abedcae973f86171c3c2e15e0b6a
Mar 17 11:16:25 crc kubenswrapper[4742]: I0317 11:16:25.199344 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" event={"ID":"777c45b3-3911-46ac-a314-db38bbed187a","Type":"ContainerStarted","Data":"74992db9a05d1a47d681d3ec1ad6d0560e871725b20f6ca4704bfcdec0eb3952"}
Mar 17 11:16:25 crc kubenswrapper[4742]: I0317 11:16:25.199692 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" event={"ID":"777c45b3-3911-46ac-a314-db38bbed187a","Type":"ContainerStarted","Data":"c84fd19a66fc33c0dc60d4879e6a358eded1a534f2e1497e373c82d53efcdc97"}
Mar 17 11:16:25 crc kubenswrapper[4742]: I0317 11:16:25.201051 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm"
Mar 17 11:16:25 crc kubenswrapper[4742]: I0317 11:16:25.204657 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562436-cnnrt" event={"ID":"f99ba73f-1688-43ea-9538-bc7623c02521","Type":"ContainerStarted","Data":"d25bf9c191dd996e77c75db24ce845e13e57af907b3295ac7cd4cc655dec79c6"}
Mar 17 11:16:25 crc kubenswrapper[4742]: I0317 11:16:25.204884 4742 patch_prober.go:28] interesting pod/route-controller-manager-dcbcd868-d8stm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body=
Mar 17 11:16:25 crc kubenswrapper[4742]: I0317 11:16:25.204951 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" podUID="777c45b3-3911-46ac-a314-db38bbed187a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused"
Mar 17 11:16:25 crc kubenswrapper[4742]: I0317 11:16:25.212154 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47pqd" event={"ID":"24946b1f-6d3e-457c-b78f-213f94b2b650","Type":"ContainerStarted","Data":"731b497c91bb2f0675aa335ccb0442569123b028ddcba64c5f9f185f223355bf"}
Mar 17 11:16:25 crc kubenswrapper[4742]: I0317 11:16:25.222425 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" podStartSLOduration=16.222410576 podStartE2EDuration="16.222410576s" podCreationTimestamp="2026-03-17 11:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:16:25.218435179 +0000 UTC m=+288.344562967" watchObservedRunningTime="2026-03-17 11:16:25.222410576 +0000 UTC m=+288.348538344"
Mar 17 11:16:25 crc kubenswrapper[4742]: I0317 11:16:25.225950 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" event={"ID":"a8cff413-5a40-4a16-a24c-3c0f82377789","Type":"ContainerStarted","Data":"276497ab2e60ca33d58f227c0f97b2fad0bbb15ed850771e37c05ac7f8fdea8b"}
Mar 17 11:16:25 crc kubenswrapper[4742]: I0317 11:16:25.226000 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" event={"ID":"a8cff413-5a40-4a16-a24c-3c0f82377789","Type":"ContainerStarted","Data":"b965bc22e86b79dbf4e5ab99419b0bdeef15abedcae973f86171c3c2e15e0b6a"}
Mar 17 11:16:25 crc kubenswrapper[4742]: I0317 11:16:25.227086 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2"
Mar 17 11:16:25 crc kubenswrapper[4742]: I0317 11:16:25.227887 4742 patch_prober.go:28] interesting pod/controller-manager-54d6b5c6f8-gr8l2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body=
Mar 17 11:16:25 crc kubenswrapper[4742]: I0317 11:16:25.227951 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" podUID="a8cff413-5a40-4a16-a24c-3c0f82377789" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused"
Mar 17 11:16:25 crc kubenswrapper[4742]: E0317 11:16:25.230209 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qzdpj" podUID="c8e4be20-e918-45d7-b026-12ef2abf3462"
Mar 17 11:16:25 crc kubenswrapper[4742]: E0317 11:16:25.230396 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-cnzgk" podUID="4b80c435-0e24-4ab2-980c-f2dfb1baef87"
Mar 17 11:16:25 crc kubenswrapper[4742]: E0317 11:16:25.232649 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-x6ffv" podUID="e0543787-88e8-463d-b01b-694ecb854bfa"
Mar 17 11:16:25 crc kubenswrapper[4742]: I0317 11:16:25.300261 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" podStartSLOduration=16.300240695 podStartE2EDuration="16.300240695s" podCreationTimestamp="2026-03-17 11:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:16:25.298373779 +0000 UTC m=+288.424501537" watchObservedRunningTime="2026-03-17 11:16:25.300240695 +0000 UTC m=+288.426368453"
Mar 17 11:16:25 crc kubenswrapper[4742]: I0317 11:16:25.910996 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cw5v4"
Mar 17 11:16:25 crc kubenswrapper[4742]: I0317 11:16:25.999028 4742 csr.go:261] certificate signing request csr-nm6n4 is approved, waiting to be issued
Mar 17 11:16:26 crc kubenswrapper[4742]: I0317 11:16:26.009516 4742 csr.go:257] certificate signing request csr-nm6n4 is issued
Mar 17 11:16:26 crc kubenswrapper[4742]: I0317 11:16:26.247682 4742 generic.go:334] "Generic (PLEG): container finished" podID="f02c3898-7b15-4f51-a0ac-45a077355791" containerID="259c9959b88f99453c51da52b7903d287d34ddf059e5278610cc794fb5eb363c" exitCode=0
Mar 17 11:16:26 crc kubenswrapper[4742]: I0317 11:16:26.247782 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f948n" event={"ID":"f02c3898-7b15-4f51-a0ac-45a077355791","Type":"ContainerDied","Data":"259c9959b88f99453c51da52b7903d287d34ddf059e5278610cc794fb5eb363c"}
Mar 17 11:16:26 crc kubenswrapper[4742]: I0317 11:16:26.267203 4742 generic.go:334] "Generic (PLEG): container finished" podID="24946b1f-6d3e-457c-b78f-213f94b2b650" containerID="731b497c91bb2f0675aa335ccb0442569123b028ddcba64c5f9f185f223355bf" exitCode=0
Mar 17 11:16:26 crc kubenswrapper[4742]: I0317 11:16:26.267265 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47pqd" event={"ID":"24946b1f-6d3e-457c-b78f-213f94b2b650","Type":"ContainerDied","Data":"731b497c91bb2f0675aa335ccb0442569123b028ddcba64c5f9f185f223355bf"}
Mar 17 11:16:26 crc kubenswrapper[4742]: I0317 11:16:26.269644 4742 generic.go:334] "Generic (PLEG): container finished" podID="5b3a8612-a5db-4ec8-9873-32829e2fe69e" containerID="b43ac5c4c08e2b62cdaee4086a7c989df52d9423ccb4111fa8cb8bb2701e5648" exitCode=0
Mar 17 11:16:26 crc kubenswrapper[4742]: I0317 11:16:26.270545 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562434-wtx87" event={"ID":"5b3a8612-a5db-4ec8-9873-32829e2fe69e","Type":"ContainerDied","Data":"b43ac5c4c08e2b62cdaee4086a7c989df52d9423ccb4111fa8cb8bb2701e5648"}
Mar 17 11:16:26 crc kubenswrapper[4742]: I0317 11:16:26.279245 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2"
Mar 17 11:16:26 crc kubenswrapper[4742]: I0317 11:16:26.280244 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm"
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.010986 4742 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-12 12:20:48.517167714 +0000 UTC
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.011668 4742 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7225h4m21.505503244s for next certificate rotation
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.289194 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f948n" event={"ID":"f02c3898-7b15-4f51-a0ac-45a077355791","Type":"ContainerStarted","Data":"47b6db3c9c5b4925825d5ce91fd9247083e8db6f3c1874f8c8c7520207d192e5"}
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.296100 4742 generic.go:334] "Generic (PLEG): container finished" podID="f99ba73f-1688-43ea-9538-bc7623c02521" containerID="3c8562a01c8b5d058ff3e1345ed1ecb4fae67d6e90929d34e54a3088d4ae1d5c" exitCode=0
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.296228 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562436-cnnrt" event={"ID":"f99ba73f-1688-43ea-9538-bc7623c02521","Type":"ContainerDied","Data":"3c8562a01c8b5d058ff3e1345ed1ecb4fae67d6e90929d34e54a3088d4ae1d5c"}
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.300304 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47pqd" event={"ID":"24946b1f-6d3e-457c-b78f-213f94b2b650","Type":"ContainerStarted","Data":"00d7d9ffa3c91c1b26a7a94c4b95be9c369685a79036814e469d8c0dec03d7e0"}
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.317354 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f948n" podStartSLOduration=2.512876136 podStartE2EDuration="33.317336808s" podCreationTimestamp="2026-03-17 11:15:54 +0000 UTC" firstStartedPulling="2026-03-17 11:15:55.918796687 +0000 UTC m=+259.044924445" lastFinishedPulling="2026-03-17 11:16:26.723257359 +0000 UTC m=+289.849385117" observedRunningTime="2026-03-17 11:16:27.314980739 +0000 UTC m=+290.441108507" watchObservedRunningTime="2026-03-17 11:16:27.317336808 +0000 UTC m=+290.443464566"
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.361154 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-47pqd" podStartSLOduration=2.546252907 podStartE2EDuration="33.361135176s" podCreationTimestamp="2026-03-17 11:15:54 +0000 UTC" firstStartedPulling="2026-03-17 11:15:55.918947661 +0000 UTC m=+259.045075419" lastFinishedPulling="2026-03-17 11:16:26.73382993 +0000 UTC m=+289.859957688" observedRunningTime="2026-03-17 11:16:27.35718837 +0000 UTC m=+290.483316148" watchObservedRunningTime="2026-03-17 11:16:27.361135176 +0000 UTC m=+290.487262944"
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.593806 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562434-wtx87"
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.693966 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86f6s\" (UniqueName: \"kubernetes.io/projected/5b3a8612-a5db-4ec8-9873-32829e2fe69e-kube-api-access-86f6s\") pod \"5b3a8612-a5db-4ec8-9873-32829e2fe69e\" (UID: \"5b3a8612-a5db-4ec8-9873-32829e2fe69e\") "
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.704372 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b3a8612-a5db-4ec8-9873-32829e2fe69e-kube-api-access-86f6s" (OuterVolumeSpecName: "kube-api-access-86f6s") pod "5b3a8612-a5db-4ec8-9873-32829e2fe69e" (UID: "5b3a8612-a5db-4ec8-9873-32829e2fe69e"). InnerVolumeSpecName "kube-api-access-86f6s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.795246 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86f6s\" (UniqueName: \"kubernetes.io/projected/5b3a8612-a5db-4ec8-9873-32829e2fe69e-kube-api-access-86f6s\") on node \"crc\" DevicePath \"\""
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.977403 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 17 11:16:27 crc kubenswrapper[4742]: E0317 11:16:27.977589 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3a8612-a5db-4ec8-9873-32829e2fe69e" containerName="oc"
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.977600 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3a8612-a5db-4ec8-9873-32829e2fe69e" containerName="oc"
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.977702 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b3a8612-a5db-4ec8-9873-32829e2fe69e" containerName="oc"
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.978044 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.980138 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.980732 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 17 11:16:27 crc kubenswrapper[4742]: I0317 11:16:27.990452 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 17 11:16:28 crc kubenswrapper[4742]: I0317 11:16:28.101839 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cb35579-3ef0-4655-bc38-93bfa2397d48-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5cb35579-3ef0-4655-bc38-93bfa2397d48\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 17 11:16:28 crc kubenswrapper[4742]: I0317 11:16:28.102223 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cb35579-3ef0-4655-bc38-93bfa2397d48-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5cb35579-3ef0-4655-bc38-93bfa2397d48\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 17 11:16:28 crc kubenswrapper[4742]: I0317 11:16:28.203272 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cb35579-3ef0-4655-bc38-93bfa2397d48-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5cb35579-3ef0-4655-bc38-93bfa2397d48\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 17 11:16:28 crc kubenswrapper[4742]: I0317 11:16:28.203408 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cb35579-3ef0-4655-bc38-93bfa2397d48-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5cb35579-3ef0-4655-bc38-93bfa2397d48\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 17 11:16:28 crc kubenswrapper[4742]: I0317 11:16:28.203423 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cb35579-3ef0-4655-bc38-93bfa2397d48-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5cb35579-3ef0-4655-bc38-93bfa2397d48\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 17 11:16:28 crc kubenswrapper[4742]: I0317 11:16:28.220407 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cb35579-3ef0-4655-bc38-93bfa2397d48-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5cb35579-3ef0-4655-bc38-93bfa2397d48\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 17 11:16:28 crc kubenswrapper[4742]: I0317 11:16:28.294594 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 17 11:16:28 crc kubenswrapper[4742]: I0317 11:16:28.308859 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562434-wtx87"
Mar 17 11:16:28 crc kubenswrapper[4742]: I0317 11:16:28.309201 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562434-wtx87" event={"ID":"5b3a8612-a5db-4ec8-9873-32829e2fe69e","Type":"ContainerDied","Data":"f3af0eb7fcb7ccb9197f7f8bc761f0c9a1016569c7cbb5432dab238ad6daf9e7"}
Mar 17 11:16:28 crc kubenswrapper[4742]: I0317 11:16:28.309237 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3af0eb7fcb7ccb9197f7f8bc761f0c9a1016569c7cbb5432dab238ad6daf9e7"
Mar 17 11:16:28 crc kubenswrapper[4742]: I0317 11:16:28.571240 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562436-cnnrt"
Mar 17 11:16:28 crc kubenswrapper[4742]: I0317 11:16:28.712681 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bql4m\" (UniqueName: \"kubernetes.io/projected/f99ba73f-1688-43ea-9538-bc7623c02521-kube-api-access-bql4m\") pod \"f99ba73f-1688-43ea-9538-bc7623c02521\" (UID: \"f99ba73f-1688-43ea-9538-bc7623c02521\") "
Mar 17 11:16:28 crc kubenswrapper[4742]: I0317 11:16:28.720146 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f99ba73f-1688-43ea-9538-bc7623c02521-kube-api-access-bql4m" (OuterVolumeSpecName: "kube-api-access-bql4m") pod "f99ba73f-1688-43ea-9538-bc7623c02521" (UID: "f99ba73f-1688-43ea-9538-bc7623c02521"). InnerVolumeSpecName "kube-api-access-bql4m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:16:28 crc kubenswrapper[4742]: I0317 11:16:28.730755 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 17 11:16:28 crc kubenswrapper[4742]: I0317 11:16:28.814161 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bql4m\" (UniqueName: \"kubernetes.io/projected/f99ba73f-1688-43ea-9538-bc7623c02521-kube-api-access-bql4m\") on node \"crc\" DevicePath \"\""
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.023933 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2"]
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.131513 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm"]
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.315127 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5cb35579-3ef0-4655-bc38-93bfa2397d48","Type":"ContainerStarted","Data":"ec43c714595c9bf9420f9bde803092f539b1004869d4346c597f8cdea0cdf11b"}
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.315170 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5cb35579-3ef0-4655-bc38-93bfa2397d48","Type":"ContainerStarted","Data":"2972daa943f26ddbd702a4d5fb26b106eef332cc5fda201c358dce0af7962578"}
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.316669 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562436-cnnrt" event={"ID":"f99ba73f-1688-43ea-9538-bc7623c02521","Type":"ContainerDied","Data":"d25bf9c191dd996e77c75db24ce845e13e57af907b3295ac7cd4cc655dec79c6"}
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.316731 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d25bf9c191dd996e77c75db24ce845e13e57af907b3295ac7cd4cc655dec79c6"
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.316789 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" podUID="a8cff413-5a40-4a16-a24c-3c0f82377789" containerName="controller-manager" containerID="cri-o://276497ab2e60ca33d58f227c0f97b2fad0bbb15ed850771e37c05ac7f8fdea8b" gracePeriod=30
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.316804 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562436-cnnrt"
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.317003 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" podUID="777c45b3-3911-46ac-a314-db38bbed187a" containerName="route-controller-manager" containerID="cri-o://74992db9a05d1a47d681d3ec1ad6d0560e871725b20f6ca4704bfcdec0eb3952" gracePeriod=30
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.328402 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.328383964 podStartE2EDuration="2.328383964s" podCreationTimestamp="2026-03-17 11:16:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:16:29.327166508 +0000 UTC m=+292.453294266" watchObservedRunningTime="2026-03-17 11:16:29.328383964 +0000 UTC m=+292.454511722"
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.766867 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm"
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.770751 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2"
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.931035 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/777c45b3-3911-46ac-a314-db38bbed187a-client-ca\") pod \"777c45b3-3911-46ac-a314-db38bbed187a\" (UID: \"777c45b3-3911-46ac-a314-db38bbed187a\") "
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.931090 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/777c45b3-3911-46ac-a314-db38bbed187a-config\") pod \"777c45b3-3911-46ac-a314-db38bbed187a\" (UID: \"777c45b3-3911-46ac-a314-db38bbed187a\") "
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.931153 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8cff413-5a40-4a16-a24c-3c0f82377789-serving-cert\") pod \"a8cff413-5a40-4a16-a24c-3c0f82377789\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") "
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.931180 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn422\" (UniqueName: \"kubernetes.io/projected/777c45b3-3911-46ac-a314-db38bbed187a-kube-api-access-pn422\") pod \"777c45b3-3911-46ac-a314-db38bbed187a\" (UID: \"777c45b3-3911-46ac-a314-db38bbed187a\") "
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.931210 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-client-ca\") pod \"a8cff413-5a40-4a16-a24c-3c0f82377789\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") "
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.931255 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/777c45b3-3911-46ac-a314-db38bbed187a-serving-cert\") pod \"777c45b3-3911-46ac-a314-db38bbed187a\" (UID: \"777c45b3-3911-46ac-a314-db38bbed187a\") "
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.931278 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-config\") pod \"a8cff413-5a40-4a16-a24c-3c0f82377789\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") "
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.931335 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwswd\" (UniqueName: \"kubernetes.io/projected/a8cff413-5a40-4a16-a24c-3c0f82377789-kube-api-access-xwswd\") pod \"a8cff413-5a40-4a16-a24c-3c0f82377789\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") "
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.931382 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-proxy-ca-bundles\") pod \"a8cff413-5a40-4a16-a24c-3c0f82377789\" (UID: \"a8cff413-5a40-4a16-a24c-3c0f82377789\") "
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.931868 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/777c45b3-3911-46ac-a314-db38bbed187a-config" (OuterVolumeSpecName: "config") pod "777c45b3-3911-46ac-a314-db38bbed187a" (UID: "777c45b3-3911-46ac-a314-db38bbed187a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.931898 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/777c45b3-3911-46ac-a314-db38bbed187a-client-ca" (OuterVolumeSpecName: "client-ca") pod "777c45b3-3911-46ac-a314-db38bbed187a" (UID: "777c45b3-3911-46ac-a314-db38bbed187a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.932258 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-client-ca" (OuterVolumeSpecName: "client-ca") pod "a8cff413-5a40-4a16-a24c-3c0f82377789" (UID: "a8cff413-5a40-4a16-a24c-3c0f82377789"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.932378 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a8cff413-5a40-4a16-a24c-3c0f82377789" (UID: "a8cff413-5a40-4a16-a24c-3c0f82377789"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.932438 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-config" (OuterVolumeSpecName: "config") pod "a8cff413-5a40-4a16-a24c-3c0f82377789" (UID: "a8cff413-5a40-4a16-a24c-3c0f82377789"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.932753 4742 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-client-ca\") on node \"crc\" DevicePath \"\""
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.932778 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-config\") on node \"crc\" DevicePath \"\""
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.932789 4742 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a8cff413-5a40-4a16-a24c-3c0f82377789-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.932802 4742 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/777c45b3-3911-46ac-a314-db38bbed187a-client-ca\") on node \"crc\" DevicePath \"\""
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.932814 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/777c45b3-3911-46ac-a314-db38bbed187a-config\") on node \"crc\" DevicePath \"\""
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.941157 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8cff413-5a40-4a16-a24c-3c0f82377789-kube-api-access-xwswd" (OuterVolumeSpecName: "kube-api-access-xwswd") pod "a8cff413-5a40-4a16-a24c-3c0f82377789" (UID: "a8cff413-5a40-4a16-a24c-3c0f82377789"). InnerVolumeSpecName "kube-api-access-xwswd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.941331 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/777c45b3-3911-46ac-a314-db38bbed187a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "777c45b3-3911-46ac-a314-db38bbed187a" (UID: "777c45b3-3911-46ac-a314-db38bbed187a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.941940 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8cff413-5a40-4a16-a24c-3c0f82377789-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a8cff413-5a40-4a16-a24c-3c0f82377789" (UID: "a8cff413-5a40-4a16-a24c-3c0f82377789"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:16:29 crc kubenswrapper[4742]: I0317 11:16:29.944415 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/777c45b3-3911-46ac-a314-db38bbed187a-kube-api-access-pn422" (OuterVolumeSpecName: "kube-api-access-pn422") pod "777c45b3-3911-46ac-a314-db38bbed187a" (UID: "777c45b3-3911-46ac-a314-db38bbed187a"). InnerVolumeSpecName "kube-api-access-pn422". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.033866 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/777c45b3-3911-46ac-a314-db38bbed187a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.033899 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwswd\" (UniqueName: \"kubernetes.io/projected/a8cff413-5a40-4a16-a24c-3c0f82377789-kube-api-access-xwswd\") on node \"crc\" DevicePath \"\""
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.033924 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8cff413-5a40-4a16-a24c-3c0f82377789-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.033932 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn422\" (UniqueName: \"kubernetes.io/projected/777c45b3-3911-46ac-a314-db38bbed187a-kube-api-access-pn422\") on node \"crc\" DevicePath \"\""
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.331207 4742 generic.go:334] "Generic (PLEG): container finished" podID="5cb35579-3ef0-4655-bc38-93bfa2397d48" containerID="ec43c714595c9bf9420f9bde803092f539b1004869d4346c597f8cdea0cdf11b" exitCode=0
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.331279 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5cb35579-3ef0-4655-bc38-93bfa2397d48","Type":"ContainerDied","Data":"ec43c714595c9bf9420f9bde803092f539b1004869d4346c597f8cdea0cdf11b"}
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.332744 4742 generic.go:334] "Generic (PLEG): container finished" podID="777c45b3-3911-46ac-a314-db38bbed187a" containerID="74992db9a05d1a47d681d3ec1ad6d0560e871725b20f6ca4704bfcdec0eb3952" exitCode=0
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.332809 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" event={"ID":"777c45b3-3911-46ac-a314-db38bbed187a","Type":"ContainerDied","Data":"74992db9a05d1a47d681d3ec1ad6d0560e871725b20f6ca4704bfcdec0eb3952"}
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.332788 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.332853 4742 scope.go:117] "RemoveContainer" containerID="74992db9a05d1a47d681d3ec1ad6d0560e871725b20f6ca4704bfcdec0eb3952"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.332842 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm" event={"ID":"777c45b3-3911-46ac-a314-db38bbed187a","Type":"ContainerDied","Data":"c84fd19a66fc33c0dc60d4879e6a358eded1a534f2e1497e373c82d53efcdc97"}
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.334478 4742 generic.go:334] "Generic (PLEG): container finished" podID="a8cff413-5a40-4a16-a24c-3c0f82377789" containerID="276497ab2e60ca33d58f227c0f97b2fad0bbb15ed850771e37c05ac7f8fdea8b" exitCode=0
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.334522 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" event={"ID":"a8cff413-5a40-4a16-a24c-3c0f82377789","Type":"ContainerDied","Data":"276497ab2e60ca33d58f227c0f97b2fad0bbb15ed850771e37c05ac7f8fdea8b"}
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.334547 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2" event={"ID":"a8cff413-5a40-4a16-a24c-3c0f82377789","Type":"ContainerDied","Data":"b965bc22e86b79dbf4e5ab99419b0bdeef15abedcae973f86171c3c2e15e0b6a"}
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.334595 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.353340 4742 scope.go:117] "RemoveContainer" containerID="74992db9a05d1a47d681d3ec1ad6d0560e871725b20f6ca4704bfcdec0eb3952"
Mar 17 11:16:30 crc kubenswrapper[4742]: E0317 11:16:30.354411 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74992db9a05d1a47d681d3ec1ad6d0560e871725b20f6ca4704bfcdec0eb3952\": container with ID starting with 74992db9a05d1a47d681d3ec1ad6d0560e871725b20f6ca4704bfcdec0eb3952 not found: ID does not exist" containerID="74992db9a05d1a47d681d3ec1ad6d0560e871725b20f6ca4704bfcdec0eb3952"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.354450 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74992db9a05d1a47d681d3ec1ad6d0560e871725b20f6ca4704bfcdec0eb3952"} err="failed to get container status \"74992db9a05d1a47d681d3ec1ad6d0560e871725b20f6ca4704bfcdec0eb3952\": rpc error: code = NotFound desc = could not find container \"74992db9a05d1a47d681d3ec1ad6d0560e871725b20f6ca4704bfcdec0eb3952\": container with ID starting with 74992db9a05d1a47d681d3ec1ad6d0560e871725b20f6ca4704bfcdec0eb3952 not found: ID does not exist"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.354475 4742 scope.go:117] "RemoveContainer" containerID="276497ab2e60ca33d58f227c0f97b2fad0bbb15ed850771e37c05ac7f8fdea8b"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.369825 4742 scope.go:117] "RemoveContainer" containerID="276497ab2e60ca33d58f227c0f97b2fad0bbb15ed850771e37c05ac7f8fdea8b"
Mar 17 11:16:30 crc kubenswrapper[4742]: E0317 11:16:30.370961 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"276497ab2e60ca33d58f227c0f97b2fad0bbb15ed850771e37c05ac7f8fdea8b\": container with ID starting with 276497ab2e60ca33d58f227c0f97b2fad0bbb15ed850771e37c05ac7f8fdea8b not found: ID does not exist" containerID="276497ab2e60ca33d58f227c0f97b2fad0bbb15ed850771e37c05ac7f8fdea8b"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.371015 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"276497ab2e60ca33d58f227c0f97b2fad0bbb15ed850771e37c05ac7f8fdea8b"} err="failed to get container status \"276497ab2e60ca33d58f227c0f97b2fad0bbb15ed850771e37c05ac7f8fdea8b\": rpc error: code = NotFound desc = could not find container \"276497ab2e60ca33d58f227c0f97b2fad0bbb15ed850771e37c05ac7f8fdea8b\": container with ID starting with 276497ab2e60ca33d58f227c0f97b2fad0bbb15ed850771e37c05ac7f8fdea8b not found: ID does not exist"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.372596 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm"]
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.380361 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dcbcd868-d8stm"]
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.388881 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2"]
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.391595 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-54d6b5c6f8-gr8l2"]
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.443085 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7f7c985957-t8psd"]
Mar 17 11:16:30 crc kubenswrapper[4742]: E0317 11:16:30.443355 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="777c45b3-3911-46ac-a314-db38bbed187a" containerName="route-controller-manager"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.443449 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="777c45b3-3911-46ac-a314-db38bbed187a" containerName="route-controller-manager"
Mar 17 11:16:30 crc kubenswrapper[4742]: E0317 11:16:30.443470 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f99ba73f-1688-43ea-9538-bc7623c02521" containerName="oc"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.443476 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f99ba73f-1688-43ea-9538-bc7623c02521" containerName="oc"
Mar 17 11:16:30 crc kubenswrapper[4742]: E0317 11:16:30.443488 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8cff413-5a40-4a16-a24c-3c0f82377789" containerName="controller-manager"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.443494 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8cff413-5a40-4a16-a24c-3c0f82377789" containerName="controller-manager"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.443616 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f99ba73f-1688-43ea-9538-bc7623c02521" containerName="oc"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.443631 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="777c45b3-3911-46ac-a314-db38bbed187a" containerName="route-controller-manager"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.443641 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8cff413-5a40-4a16-a24c-3c0f82377789" containerName="controller-manager"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.444537 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.446871 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.447023 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.446879 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.447265 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.447388 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.447478 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.448207 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j"]
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.449063 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.452357 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.452803 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.452922 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.452818 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.453081 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.454104 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.459044 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.465523 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j"]
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.471041 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f7c985957-t8psd"]
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.543852 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-client-ca\") pod \"controller-manager-7f7c985957-t8psd\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.543917 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28dbb22f-82bf-40db-8de3-1473d2aea429-client-ca\") pod \"route-controller-manager-57c9cf9d98-7t26j\" (UID: \"28dbb22f-82bf-40db-8de3-1473d2aea429\") " pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.543946 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-config\") pod \"controller-manager-7f7c985957-t8psd\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.543993 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-proxy-ca-bundles\") pod \"controller-manager-7f7c985957-t8psd\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.544047 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/479f957a-0e3f-42e0-a6b2-05d1fa30984f-serving-cert\") pod \"controller-manager-7f7c985957-t8psd\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.544065 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t28h\" (UniqueName: \"kubernetes.io/projected/28dbb22f-82bf-40db-8de3-1473d2aea429-kube-api-access-2t28h\") pod \"route-controller-manager-57c9cf9d98-7t26j\" (UID: \"28dbb22f-82bf-40db-8de3-1473d2aea429\") " pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.544090 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28dbb22f-82bf-40db-8de3-1473d2aea429-config\") pod \"route-controller-manager-57c9cf9d98-7t26j\" (UID: \"28dbb22f-82bf-40db-8de3-1473d2aea429\") " pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j"
Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.544129 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28dbb22f-82bf-40db-8de3-1473d2aea429-serving-cert\") pod \"route-controller-manager-57c9cf9d98-7t26j\" (UID: \"28dbb22f-82bf-40db-8de3-1473d2aea429\") " pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j"
Mar 17 
11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.544146 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdbvl\" (UniqueName: \"kubernetes.io/projected/479f957a-0e3f-42e0-a6b2-05d1fa30984f-kube-api-access-pdbvl\") pod \"controller-manager-7f7c985957-t8psd\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.645219 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/479f957a-0e3f-42e0-a6b2-05d1fa30984f-serving-cert\") pod \"controller-manager-7f7c985957-t8psd\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.645485 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t28h\" (UniqueName: \"kubernetes.io/projected/28dbb22f-82bf-40db-8de3-1473d2aea429-kube-api-access-2t28h\") pod \"route-controller-manager-57c9cf9d98-7t26j\" (UID: \"28dbb22f-82bf-40db-8de3-1473d2aea429\") " pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.645507 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28dbb22f-82bf-40db-8de3-1473d2aea429-config\") pod \"route-controller-manager-57c9cf9d98-7t26j\" (UID: \"28dbb22f-82bf-40db-8de3-1473d2aea429\") " pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.645546 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28dbb22f-82bf-40db-8de3-1473d2aea429-serving-cert\") pod \"route-controller-manager-57c9cf9d98-7t26j\" (UID: \"28dbb22f-82bf-40db-8de3-1473d2aea429\") " pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.645561 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdbvl\" (UniqueName: \"kubernetes.io/projected/479f957a-0e3f-42e0-a6b2-05d1fa30984f-kube-api-access-pdbvl\") pod \"controller-manager-7f7c985957-t8psd\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.645588 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-client-ca\") pod \"controller-manager-7f7c985957-t8psd\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.645605 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28dbb22f-82bf-40db-8de3-1473d2aea429-client-ca\") pod \"route-controller-manager-57c9cf9d98-7t26j\" (UID: \"28dbb22f-82bf-40db-8de3-1473d2aea429\") " pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.645663 4742 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-config\") pod \"controller-manager-7f7c985957-t8psd\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.645702 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-proxy-ca-bundles\") pod \"controller-manager-7f7c985957-t8psd\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.646721 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28dbb22f-82bf-40db-8de3-1473d2aea429-client-ca\") pod \"route-controller-manager-57c9cf9d98-7t26j\" (UID: \"28dbb22f-82bf-40db-8de3-1473d2aea429\") " pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.646888 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-proxy-ca-bundles\") pod \"controller-manager-7f7c985957-t8psd\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.646885 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-config\") pod \"controller-manager-7f7c985957-t8psd\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.647198 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28dbb22f-82bf-40db-8de3-1473d2aea429-config\") pod \"route-controller-manager-57c9cf9d98-7t26j\" (UID: \"28dbb22f-82bf-40db-8de3-1473d2aea429\") " pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.647371 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-client-ca\") pod \"controller-manager-7f7c985957-t8psd\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.652475 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28dbb22f-82bf-40db-8de3-1473d2aea429-serving-cert\") pod \"route-controller-manager-57c9cf9d98-7t26j\" (UID: \"28dbb22f-82bf-40db-8de3-1473d2aea429\") " pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.656352 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/479f957a-0e3f-42e0-a6b2-05d1fa30984f-serving-cert\") pod \"controller-manager-7f7c985957-t8psd\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") 
" pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.667272 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t28h\" (UniqueName: \"kubernetes.io/projected/28dbb22f-82bf-40db-8de3-1473d2aea429-kube-api-access-2t28h\") pod \"route-controller-manager-57c9cf9d98-7t26j\" (UID: \"28dbb22f-82bf-40db-8de3-1473d2aea429\") " pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.668842 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="777c45b3-3911-46ac-a314-db38bbed187a" path="/var/lib/kubelet/pods/777c45b3-3911-46ac-a314-db38bbed187a/volumes" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.669347 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8cff413-5a40-4a16-a24c-3c0f82377789" path="/var/lib/kubelet/pods/a8cff413-5a40-4a16-a24c-3c0f82377789/volumes" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.673953 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdbvl\" (UniqueName: \"kubernetes.io/projected/479f957a-0e3f-42e0-a6b2-05d1fa30984f-kube-api-access-pdbvl\") pod \"controller-manager-7f7c985957-t8psd\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.774835 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" Mar 17 11:16:30 crc kubenswrapper[4742]: I0317 11:16:30.791233 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" Mar 17 11:16:31 crc kubenswrapper[4742]: I0317 11:16:31.069991 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7f7c985957-t8psd"] Mar 17 11:16:31 crc kubenswrapper[4742]: I0317 11:16:31.314122 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j"] Mar 17 11:16:31 crc kubenswrapper[4742]: W0317 11:16:31.316716 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28dbb22f_82bf_40db_8de3_1473d2aea429.slice/crio-c87c61152fbe0a03c81487a773e344393bac6a64c5768edde6807de7d7d5272f WatchSource:0}: Error finding container c87c61152fbe0a03c81487a773e344393bac6a64c5768edde6807de7d7d5272f: Status 404 returned error can't find the container with id c87c61152fbe0a03c81487a773e344393bac6a64c5768edde6807de7d7d5272f Mar 17 11:16:31 crc kubenswrapper[4742]: I0317 11:16:31.338976 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" event={"ID":"28dbb22f-82bf-40db-8de3-1473d2aea429","Type":"ContainerStarted","Data":"c87c61152fbe0a03c81487a773e344393bac6a64c5768edde6807de7d7d5272f"} Mar 17 11:16:31 crc kubenswrapper[4742]: I0317 11:16:31.341090 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" event={"ID":"479f957a-0e3f-42e0-a6b2-05d1fa30984f","Type":"ContainerStarted","Data":"d2a702a3b2e39e23f71d919bac5b9381266361f9ee72045fb7d1d4df74ab23b5"} Mar 17 11:16:31 crc kubenswrapper[4742]: I0317 
11:16:31.586254 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 11:16:31 crc kubenswrapper[4742]: I0317 11:16:31.669828 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cb35579-3ef0-4655-bc38-93bfa2397d48-kubelet-dir\") pod \"5cb35579-3ef0-4655-bc38-93bfa2397d48\" (UID: \"5cb35579-3ef0-4655-bc38-93bfa2397d48\") " Mar 17 11:16:31 crc kubenswrapper[4742]: I0317 11:16:31.670455 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cb35579-3ef0-4655-bc38-93bfa2397d48-kube-api-access\") pod \"5cb35579-3ef0-4655-bc38-93bfa2397d48\" (UID: \"5cb35579-3ef0-4655-bc38-93bfa2397d48\") " Mar 17 11:16:31 crc kubenswrapper[4742]: I0317 11:16:31.669988 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5cb35579-3ef0-4655-bc38-93bfa2397d48-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5cb35579-3ef0-4655-bc38-93bfa2397d48" (UID: "5cb35579-3ef0-4655-bc38-93bfa2397d48"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:16:31 crc kubenswrapper[4742]: I0317 11:16:31.670899 4742 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cb35579-3ef0-4655-bc38-93bfa2397d48-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:31 crc kubenswrapper[4742]: I0317 11:16:31.675121 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cb35579-3ef0-4655-bc38-93bfa2397d48-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5cb35579-3ef0-4655-bc38-93bfa2397d48" (UID: "5cb35579-3ef0-4655-bc38-93bfa2397d48"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:16:31 crc kubenswrapper[4742]: I0317 11:16:31.771711 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cb35579-3ef0-4655-bc38-93bfa2397d48-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:32 crc kubenswrapper[4742]: I0317 11:16:32.348180 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" event={"ID":"28dbb22f-82bf-40db-8de3-1473d2aea429","Type":"ContainerStarted","Data":"855ae5474342446d428cfefc5b07d24c9d2ba27ba251bc9cb6e2412898b11865"} Mar 17 11:16:32 crc kubenswrapper[4742]: I0317 11:16:32.348927 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" Mar 17 11:16:32 crc kubenswrapper[4742]: I0317 11:16:32.351188 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5cb35579-3ef0-4655-bc38-93bfa2397d48","Type":"ContainerDied","Data":"2972daa943f26ddbd702a4d5fb26b106eef332cc5fda201c358dce0af7962578"} Mar 17 11:16:32 crc kubenswrapper[4742]: I0317 11:16:32.351245 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2972daa943f26ddbd702a4d5fb26b106eef332cc5fda201c358dce0af7962578" Mar 17 11:16:32 crc kubenswrapper[4742]: I0317 11:16:32.351212 4742 util.go:48] "No ready sandbox for pod can be found. 
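The mount and unmount entries above follow a fixed per-volume sequence: VerifyControllerAttachedVolume -> MountVolume started -> MountVolume.SetUp succeeded while a pod comes up, and UnmountVolume started -> TearDown succeeded -> "Volume detached" while one goes away. A rough way to reconstruct per-volume state from a journal like this one; a sketch that assumes one entry per line (as journalctl normally emits), with volume_states as a hypothetical helper:

```python
import re

# Phase strings are copied verbatim from the reconciler_common.go /
# operation_generator.go entries above.
PHASES = [
    ("operationExecutor.VerifyControllerAttachedVolume started", "attach-verified"),
    ("operationExecutor.MountVolume started", "mount-started"),
    ("MountVolume.SetUp succeeded", "mounted"),
    ("operationExecutor.UnmountVolume started", "unmount-started"),
    ("UnmountVolume.TearDown succeeded", "torn-down"),
    ("Volume detached", "detached"),
]
# Matches both the escaped form (volume \"client-ca\") and the plain form.
VOLUME = re.compile(r'volume \\?"(?P<name>[^"\\]+)\\?"')

def volume_states(journal_text: str) -> dict:
    """Map each volume name to the last lifecycle phase observed for it.
    Assumes one journal entry per line."""
    state = {}
    for line in journal_text.splitlines():
        for marker, phase in PHASES:
            if marker in line:
                match = VOLUME.search(line)
                if match:
                    state[match.group("name")] = phase
                break
    return state
```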
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 17 11:16:32 crc kubenswrapper[4742]: I0317 11:16:32.352498 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" event={"ID":"479f957a-0e3f-42e0-a6b2-05d1fa30984f","Type":"ContainerStarted","Data":"86d22f30e5e83ae4c5d4dca67e2ed6b52f85e69b919eeff9fed5260677cd6b8c"} Mar 17 11:16:32 crc kubenswrapper[4742]: I0317 11:16:32.368555 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" podStartSLOduration=3.368534559 podStartE2EDuration="3.368534559s" podCreationTimestamp="2026-03-17 11:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:16:32.367282063 +0000 UTC m=+295.493409831" watchObservedRunningTime="2026-03-17 11:16:32.368534559 +0000 UTC m=+295.494662317" Mar 17 11:16:32 crc kubenswrapper[4742]: I0317 11:16:32.389104 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" podStartSLOduration=3.389081884 podStartE2EDuration="3.389081884s" podCreationTimestamp="2026-03-17 11:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:16:32.385034885 +0000 UTC m=+295.511162643" watchObservedRunningTime="2026-03-17 11:16:32.389081884 +0000 UTC m=+295.515209642" Mar 17 11:16:32 crc kubenswrapper[4742]: I0317 11:16:32.467891 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" Mar 17 11:16:33 crc kubenswrapper[4742]: I0317 11:16:33.364055 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" Mar 17 11:16:33 crc kubenswrapper[4742]: I0317 11:16:33.367886 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" Mar 17 11:16:33 crc kubenswrapper[4742]: I0317 11:16:33.979337 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 17 11:16:33 crc kubenswrapper[4742]: E0317 11:16:33.981345 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cb35579-3ef0-4655-bc38-93bfa2397d48" containerName="pruner" Mar 17 11:16:33 crc kubenswrapper[4742]: I0317 11:16:33.981719 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cb35579-3ef0-4655-bc38-93bfa2397d48" containerName="pruner" Mar 17 11:16:33 crc kubenswrapper[4742]: I0317 11:16:33.982031 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cb35579-3ef0-4655-bc38-93bfa2397d48" containerName="pruner" Mar 17 11:16:33 crc kubenswrapper[4742]: I0317 11:16:33.984892 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 17 11:16:33 crc kubenswrapper[4742]: I0317 11:16:33.982480 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 17 11:16:33 crc kubenswrapper[4742]: I0317 11:16:33.987044 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 17 11:16:33 crc kubenswrapper[4742]: I0317 11:16:33.987261 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 17 11:16:34 crc kubenswrapper[4742]: I0317 11:16:34.001686 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-kube-api-access\") pod \"installer-9-crc\" (UID: \"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 11:16:34 crc kubenswrapper[4742]: I0317 11:16:34.002164 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 11:16:34 crc kubenswrapper[4742]: I0317 11:16:34.002265 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-var-lock\") pod \"installer-9-crc\" (UID: \"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 11:16:34 crc kubenswrapper[4742]: I0317 11:16:34.103652 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-kube-api-access\") pod \"installer-9-crc\" (UID: \"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 11:16:34 crc kubenswrapper[4742]: I0317 11:16:34.103728 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 11:16:34 crc kubenswrapper[4742]: I0317 11:16:34.103786 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-var-lock\") pod \"installer-9-crc\" (UID: \"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 11:16:34 crc kubenswrapper[4742]: I0317 11:16:34.103868 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-var-lock\") pod \"installer-9-crc\" (UID: \"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 11:16:34 crc kubenswrapper[4742]: I0317 11:16:34.103889 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 11:16:34 crc kubenswrapper[4742]: I0317 11:16:34.121244 4742 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-kube-api-access\") pod \"installer-9-crc\" (UID: \"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 17 11:16:34 crc kubenswrapper[4742]: I0317 11:16:34.301162 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 17 11:16:34 crc kubenswrapper[4742]: I0317 11:16:34.645348 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-47pqd" Mar 17 11:16:34 crc kubenswrapper[4742]: I0317 11:16:34.645726 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-47pqd" Mar 17 11:16:34 crc kubenswrapper[4742]: I0317 11:16:34.734487 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 17 11:16:34 crc kubenswrapper[4742]: W0317 11:16:34.736842 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0fcbfeb4_d9c8_4004_b86e_191d6c8e12c6.slice/crio-62fc9caf74e9cf002e89804c637b030227182227b79e2dcd33987b804d3783eb WatchSource:0}: Error finding container 62fc9caf74e9cf002e89804c637b030227182227b79e2dcd33987b804d3783eb: Status 404 returned error can't find the container with id 62fc9caf74e9cf002e89804c637b030227182227b79e2dcd33987b804d3783eb Mar 17 11:16:34 crc kubenswrapper[4742]: I0317 11:16:34.903363 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-47pqd" Mar 17 11:16:35 crc kubenswrapper[4742]: I0317 11:16:35.060218 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f948n" Mar 17 11:16:35 crc kubenswrapper[4742]: I0317 11:16:35.060710 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f948n" Mar 17 11:16:35 crc kubenswrapper[4742]: I0317 11:16:35.110537 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f948n" Mar 17 11:16:35 crc kubenswrapper[4742]: I0317 11:16:35.391714 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6","Type":"ContainerStarted","Data":"748188c35812e90d9fef199e90c431a40e09dc09f4e13ae6bc0fe6f6b755c967"} Mar 17 11:16:35 crc kubenswrapper[4742]: I0317 11:16:35.391774 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6","Type":"ContainerStarted","Data":"62fc9caf74e9cf002e89804c637b030227182227b79e2dcd33987b804d3783eb"} Mar 17 11:16:35 crc kubenswrapper[4742]: I0317 11:16:35.414223 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.4142066079999998 podStartE2EDuration="2.414206608s" podCreationTimestamp="2026-03-17 11:16:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:16:35.407024807 +0000 UTC m=+298.533152565" watchObservedRunningTime="2026-03-17 11:16:35.414206608 +0000 UTC m=+298.540334366" Mar 17 11:16:35 crc kubenswrapper[4742]: I0317 11:16:35.428998 4742 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f948n" Mar 17 11:16:35 crc kubenswrapper[4742]: I0317 11:16:35.440744 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-47pqd" Mar 17 11:16:36 crc kubenswrapper[4742]: I0317 11:16:36.128083 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f948n"] Mar 17 11:16:37 crc kubenswrapper[4742]: I0317 11:16:37.403359 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f948n" podUID="f02c3898-7b15-4f51-a0ac-45a077355791" containerName="registry-server" containerID="cri-o://47b6db3c9c5b4925825d5ce91fd9247083e8db6f3c1874f8c8c7520207d192e5" gracePeriod=2 Mar 17 11:16:38 crc kubenswrapper[4742]: I0317 11:16:38.413545 4742 generic.go:334] "Generic (PLEG): container finished" podID="f02c3898-7b15-4f51-a0ac-45a077355791" containerID="47b6db3c9c5b4925825d5ce91fd9247083e8db6f3c1874f8c8c7520207d192e5" exitCode=0 Mar 17 11:16:38 crc kubenswrapper[4742]: I0317 11:16:38.413605 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f948n" event={"ID":"f02c3898-7b15-4f51-a0ac-45a077355791","Type":"ContainerDied","Data":"47b6db3c9c5b4925825d5ce91fd9247083e8db6f3c1874f8c8c7520207d192e5"} Mar 17 11:16:39 crc kubenswrapper[4742]: I0317 11:16:39.283386 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f948n" Mar 17 11:16:39 crc kubenswrapper[4742]: I0317 11:16:39.370653 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp7v8\" (UniqueName: \"kubernetes.io/projected/f02c3898-7b15-4f51-a0ac-45a077355791-kube-api-access-wp7v8\") pod \"f02c3898-7b15-4f51-a0ac-45a077355791\" (UID: \"f02c3898-7b15-4f51-a0ac-45a077355791\") " Mar 17 11:16:39 crc kubenswrapper[4742]: I0317 11:16:39.370707 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f02c3898-7b15-4f51-a0ac-45a077355791-utilities\") pod \"f02c3898-7b15-4f51-a0ac-45a077355791\" (UID: \"f02c3898-7b15-4f51-a0ac-45a077355791\") " Mar 17 11:16:39 crc kubenswrapper[4742]: I0317 11:16:39.371005 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f02c3898-7b15-4f51-a0ac-45a077355791-catalog-content\") pod \"f02c3898-7b15-4f51-a0ac-45a077355791\" (UID: \"f02c3898-7b15-4f51-a0ac-45a077355791\") " Mar 17 11:16:39 crc kubenswrapper[4742]: I0317 11:16:39.371489 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f02c3898-7b15-4f51-a0ac-45a077355791-utilities" (OuterVolumeSpecName: "utilities") pod "f02c3898-7b15-4f51-a0ac-45a077355791" (UID: "f02c3898-7b15-4f51-a0ac-45a077355791"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:16:39 crc kubenswrapper[4742]: I0317 11:16:39.372523 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f02c3898-7b15-4f51-a0ac-45a077355791-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:39 crc kubenswrapper[4742]: I0317 11:16:39.376394 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f02c3898-7b15-4f51-a0ac-45a077355791-kube-api-access-wp7v8" (OuterVolumeSpecName: "kube-api-access-wp7v8") pod "f02c3898-7b15-4f51-a0ac-45a077355791" (UID: "f02c3898-7b15-4f51-a0ac-45a077355791"). InnerVolumeSpecName "kube-api-access-wp7v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:16:39 crc kubenswrapper[4742]: I0317 11:16:39.420155 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f948n" event={"ID":"f02c3898-7b15-4f51-a0ac-45a077355791","Type":"ContainerDied","Data":"1bf6dbad41f7201f3167ca65e63d5ef269bfbf4cb39e3f47c6a2156f060da085"} Mar 17 11:16:39 crc kubenswrapper[4742]: I0317 11:16:39.420214 4742 scope.go:117] "RemoveContainer" containerID="47b6db3c9c5b4925825d5ce91fd9247083e8db6f3c1874f8c8c7520207d192e5" Mar 17 11:16:39 crc kubenswrapper[4742]: I0317 11:16:39.420215 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f948n" Mar 17 11:16:39 crc kubenswrapper[4742]: I0317 11:16:39.441163 4742 scope.go:117] "RemoveContainer" containerID="259c9959b88f99453c51da52b7903d287d34ddf059e5278610cc794fb5eb363c" Mar 17 11:16:39 crc kubenswrapper[4742]: I0317 11:16:39.474470 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp7v8\" (UniqueName: \"kubernetes.io/projected/f02c3898-7b15-4f51-a0ac-45a077355791-kube-api-access-wp7v8\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:39 crc kubenswrapper[4742]: I0317 11:16:39.477398 4742 scope.go:117] "RemoveContainer" containerID="5eab7d7ed9f56d14ec449087fda6347e692d36ff14a0c81de1a7cf14f289f796" Mar 17 11:16:39 crc kubenswrapper[4742]: I0317 11:16:39.505299 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f02c3898-7b15-4f51-a0ac-45a077355791-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f02c3898-7b15-4f51-a0ac-45a077355791" (UID: "f02c3898-7b15-4f51-a0ac-45a077355791"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:16:39 crc kubenswrapper[4742]: I0317 11:16:39.576258 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f02c3898-7b15-4f51-a0ac-45a077355791-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:39 crc kubenswrapper[4742]: I0317 11:16:39.748470 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f948n"] Mar 17 11:16:39 crc kubenswrapper[4742]: I0317 11:16:39.756847 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f948n"] Mar 17 11:16:40 crc kubenswrapper[4742]: I0317 11:16:40.430126 4742 generic.go:334] "Generic (PLEG): container finished" podID="ce3a51df-d6e4-46d5-95e3-8be6aaba196f" containerID="c3d3f4713c444f242c6ffb4f4d76e1e7bddaece8ab74469f733e981aca89b4e9" exitCode=0 Mar 17 11:16:40 crc kubenswrapper[4742]: I0317 11:16:40.430164 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p27vh" event={"ID":"ce3a51df-d6e4-46d5-95e3-8be6aaba196f","Type":"ContainerDied","Data":"c3d3f4713c444f242c6ffb4f4d76e1e7bddaece8ab74469f733e981aca89b4e9"} Mar 17 11:16:40 crc kubenswrapper[4742]: I0317 11:16:40.673778 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f02c3898-7b15-4f51-a0ac-45a077355791" path="/var/lib/kubelet/pods/f02c3898-7b15-4f51-a0ac-45a077355791/volumes" Mar 17 11:16:41 crc kubenswrapper[4742]: I0317 11:16:41.437171 4742 generic.go:334] "Generic (PLEG): container finished" podID="e0543787-88e8-463d-b01b-694ecb854bfa" containerID="cdcf34cff56f54047d55cef192e5b039a333bf35ae0b417a5691100bee627c83" exitCode=0 Mar 17 11:16:41 crc kubenswrapper[4742]: I0317 11:16:41.437274 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6ffv" event={"ID":"e0543787-88e8-463d-b01b-694ecb854bfa","Type":"ContainerDied","Data":"cdcf34cff56f54047d55cef192e5b039a333bf35ae0b417a5691100bee627c83"} Mar 17 11:16:41 crc kubenswrapper[4742]: I0317 11:16:41.440662 4742 generic.go:334] "Generic (PLEG): container finished" podID="d416e1fd-2137-48a4-b933-b25a9ca94a8a" containerID="6ed674a64a5dc5b434ad8ef82af06dd2fc6ac4452025080af4763f35894c9c9e" exitCode=0 Mar 17 11:16:41 crc kubenswrapper[4742]: I0317 11:16:41.440734 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mvgb" event={"ID":"d416e1fd-2137-48a4-b933-b25a9ca94a8a","Type":"ContainerDied","Data":"6ed674a64a5dc5b434ad8ef82af06dd2fc6ac4452025080af4763f35894c9c9e"} Mar 17 11:16:41 crc kubenswrapper[4742]: I0317 11:16:41.444052 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p27vh" event={"ID":"ce3a51df-d6e4-46d5-95e3-8be6aaba196f","Type":"ContainerStarted","Data":"17a2045c0843315f80387afe38ad29abf88b8c06659c4014ee2a76740605e7c6"} Mar 17 11:16:41 crc kubenswrapper[4742]: I0317 11:16:41.446181 4742 generic.go:334] "Generic (PLEG): container finished" podID="4b80c435-0e24-4ab2-980c-f2dfb1baef87" containerID="b82748e7724c6322f05fe1c0e570dbc35567a479e20975a5a9cfed7d12958648" exitCode=0 Mar 17 11:16:41 crc kubenswrapper[4742]: I0317 11:16:41.446217 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cnzgk" event={"ID":"4b80c435-0e24-4ab2-980c-f2dfb1baef87","Type":"ContainerDied","Data":"b82748e7724c6322f05fe1c0e570dbc35567a479e20975a5a9cfed7d12958648"} Mar 17 
11:16:41 crc kubenswrapper[4742]: I0317 11:16:41.452009 4742 generic.go:334] "Generic (PLEG): container finished" podID="c8e4be20-e918-45d7-b026-12ef2abf3462" containerID="87a4c635aa94ef01f09f17a7f79683196bcf114d0c4bac75218e53183a620263" exitCode=0 Mar 17 11:16:41 crc kubenswrapper[4742]: I0317 11:16:41.452078 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzdpj" event={"ID":"c8e4be20-e918-45d7-b026-12ef2abf3462","Type":"ContainerDied","Data":"87a4c635aa94ef01f09f17a7f79683196bcf114d0c4bac75218e53183a620263"} Mar 17 11:16:41 crc kubenswrapper[4742]: I0317 11:16:41.454969 4742 generic.go:334] "Generic (PLEG): container finished" podID="72e6f877-4431-46ba-8c22-0479a383851b" containerID="ab31dfd2527d00b4290f7c0d2193daf9625e62861189e20d7225985e0570976c" exitCode=0 Mar 17 11:16:41 crc kubenswrapper[4742]: I0317 11:16:41.455017 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5v4hw" event={"ID":"72e6f877-4431-46ba-8c22-0479a383851b","Type":"ContainerDied","Data":"ab31dfd2527d00b4290f7c0d2193daf9625e62861189e20d7225985e0570976c"} Mar 17 11:16:41 crc kubenswrapper[4742]: I0317 11:16:41.516474 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p27vh" podStartSLOduration=2.490514202 podStartE2EDuration="48.516457267s" podCreationTimestamp="2026-03-17 11:15:53 +0000 UTC" firstStartedPulling="2026-03-17 11:15:54.866192454 +0000 UTC m=+257.992320212" lastFinishedPulling="2026-03-17 11:16:40.892135519 +0000 UTC m=+304.018263277" observedRunningTime="2026-03-17 11:16:41.514930882 +0000 UTC m=+304.641058640" watchObservedRunningTime="2026-03-17 11:16:41.516457267 +0000 UTC m=+304.642585025" Mar 17 11:16:42 crc kubenswrapper[4742]: I0317 11:16:42.462174 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cnzgk" event={"ID":"4b80c435-0e24-4ab2-980c-f2dfb1baef87","Type":"ContainerStarted","Data":"69bfcbe9d9a000afa411d70603f541dc847747a1f3a53f3401ad9723ce234031"} Mar 17 11:16:42 crc kubenswrapper[4742]: I0317 11:16:42.464332 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzdpj" event={"ID":"c8e4be20-e918-45d7-b026-12ef2abf3462","Type":"ContainerStarted","Data":"3215c894254e2324910b80266291f0d92c4ac5bb237be83b41b750a846978808"} Mar 17 11:16:42 crc kubenswrapper[4742]: I0317 11:16:42.466112 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5v4hw" event={"ID":"72e6f877-4431-46ba-8c22-0479a383851b","Type":"ContainerStarted","Data":"bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963"} Mar 17 11:16:42 crc kubenswrapper[4742]: I0317 11:16:42.468818 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6ffv" event={"ID":"e0543787-88e8-463d-b01b-694ecb854bfa","Type":"ContainerStarted","Data":"0584db02d32479030b8788b3be5bf68ebb3fd552ce14569ce6c7774598f6806b"} Mar 17 11:16:42 crc kubenswrapper[4742]: I0317 11:16:42.470686 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mvgb" event={"ID":"d416e1fd-2137-48a4-b933-b25a9ca94a8a","Type":"ContainerStarted","Data":"180c4598b6b78e487454c88e69b19b12636da56e0db520083d00fcd0cd4efd57"} Mar 17 11:16:42 crc kubenswrapper[4742]: I0317 11:16:42.477810 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-cnzgk" podStartSLOduration=3.316879401 podStartE2EDuration="51.477795625s" podCreationTimestamp="2026-03-17 11:15:51 +0000 UTC" firstStartedPulling="2026-03-17 11:15:53.758193673 +0000 UTC m=+256.884321431" lastFinishedPulling="2026-03-17 11:16:41.919109897 +0000 UTC m=+305.045237655" observedRunningTime="2026-03-17 11:16:42.477354213 +0000 UTC m=+305.603481971" watchObservedRunningTime="2026-03-17 11:16:42.477795625 +0000 UTC m=+305.603923383" Mar 17 11:16:42 crc kubenswrapper[4742]: I0317 11:16:42.500993 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qzdpj" podStartSLOduration=3.090005514 podStartE2EDuration="51.500976368s" podCreationTimestamp="2026-03-17 11:15:51 +0000 UTC" firstStartedPulling="2026-03-17 11:15:53.769244192 +0000 UTC m=+256.895371950" lastFinishedPulling="2026-03-17 11:16:42.180215046 +0000 UTC m=+305.306342804" observedRunningTime="2026-03-17 11:16:42.497775283 +0000 UTC m=+305.623903081" watchObservedRunningTime="2026-03-17 11:16:42.500976368 +0000 UTC m=+305.627104126" Mar 17 11:16:42 crc kubenswrapper[4742]: I0317 11:16:42.515760 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6mvgb" podStartSLOduration=2.451433304 podStartE2EDuration="49.515743691s" podCreationTimestamp="2026-03-17 11:15:53 +0000 UTC" firstStartedPulling="2026-03-17 11:15:54.865403512 +0000 UTC m=+257.991531270" lastFinishedPulling="2026-03-17 11:16:41.929713899 +0000 UTC m=+305.055841657" observedRunningTime="2026-03-17 11:16:42.515272188 +0000 UTC m=+305.641399956" watchObservedRunningTime="2026-03-17 11:16:42.515743691 +0000 UTC m=+305.641871449" Mar 17 11:16:42 crc kubenswrapper[4742]: I0317 11:16:42.542442 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5v4hw" podStartSLOduration=3.45995188 podStartE2EDuration="51.542424136s" podCreationTimestamp="2026-03-17 11:15:51 +0000 UTC" firstStartedPulling="2026-03-17 11:15:53.773895366 +0000 UTC m=+256.900023124" lastFinishedPulling="2026-03-17 11:16:41.856367622 +0000 UTC m=+304.982495380" observedRunningTime="2026-03-17 11:16:42.540104718 +0000 UTC m=+305.666232486" watchObservedRunningTime="2026-03-17 11:16:42.542424136 +0000 UTC m=+305.668551894" Mar 17 11:16:42 crc kubenswrapper[4742]: I0317 11:16:42.561126 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x6ffv" podStartSLOduration=3.531813376 podStartE2EDuration="51.561111646s" podCreationTimestamp="2026-03-17 11:15:51 +0000 UTC" firstStartedPulling="2026-03-17 11:15:53.808182527 +0000 UTC m=+256.934310285" lastFinishedPulling="2026-03-17 11:16:41.837480797 +0000 UTC m=+304.963608555" observedRunningTime="2026-03-17 11:16:42.557310034 +0000 UTC m=+305.683437792" watchObservedRunningTime="2026-03-17 11:16:42.561111646 +0000 UTC m=+305.687239404" Mar 17 11:16:43 crc kubenswrapper[4742]: I0317 11:16:43.634937 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p27vh" Mar 17 11:16:43 crc kubenswrapper[4742]: I0317 11:16:43.635000 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p27vh" Mar 17 11:16:43 crc kubenswrapper[4742]: I0317 11:16:43.675284 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-p27vh" Mar 17 11:16:44 crc kubenswrapper[4742]: I0317 11:16:44.045150 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6mvgb" Mar 17 11:16:44 crc kubenswrapper[4742]: I0317 11:16:44.045197 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6mvgb" Mar 17 11:16:44 crc kubenswrapper[4742]: I0317 11:16:44.046656 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-spkdx"] Mar 17 11:16:45 crc kubenswrapper[4742]: I0317 11:16:45.140233 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-6mvgb" podUID="d416e1fd-2137-48a4-b933-b25a9ca94a8a" containerName="registry-server" probeResult="failure" output=< Mar 17 11:16:45 crc kubenswrapper[4742]: timeout: failed to connect service ":50051" within 1s Mar 17 11:16:45 crc kubenswrapper[4742]: > Mar 17 11:16:48 crc kubenswrapper[4742]: I0317 11:16:48.044154 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:16:48 crc kubenswrapper[4742]: I0317 11:16:48.044506 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:16:48 crc kubenswrapper[4742]: I0317 11:16:48.044558 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:16:48 crc kubenswrapper[4742]: I0317 11:16:48.045226 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092"} pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 11:16:48 crc kubenswrapper[4742]: I0317 11:16:48.045304 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" containerID="cri-o://4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092" gracePeriod=600 Mar 17 11:16:48 crc kubenswrapper[4742]: I0317 11:16:48.508341 4742 generic.go:334] "Generic (PLEG): container finished" podID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerID="4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092" exitCode=0 Mar 17 11:16:48 crc kubenswrapper[4742]: I0317 11:16:48.508416 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerDied","Data":"4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092"} Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.086557 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j"] Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.087240 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" podUID="28dbb22f-82bf-40db-8de3-1473d2aea429" containerName="route-controller-manager" containerID="cri-o://855ae5474342446d428cfefc5b07d24c9d2ba27ba251bc9cb6e2412898b11865" gracePeriod=30 Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.090783 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f7c985957-t8psd"] Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.091094 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" podUID="479f957a-0e3f-42e0-a6b2-05d1fa30984f" containerName="controller-manager" containerID="cri-o://86d22f30e5e83ae4c5d4dca67e2ed6b52f85e69b919eeff9fed5260677cd6b8c" gracePeriod=30 Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.521517 4742 generic.go:334] "Generic (PLEG): container finished" podID="28dbb22f-82bf-40db-8de3-1473d2aea429" containerID="855ae5474342446d428cfefc5b07d24c9d2ba27ba251bc9cb6e2412898b11865" exitCode=0 Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.521564 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" event={"ID":"28dbb22f-82bf-40db-8de3-1473d2aea429","Type":"ContainerDied","Data":"855ae5474342446d428cfefc5b07d24c9d2ba27ba251bc9cb6e2412898b11865"} Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.543109 4742 generic.go:334] "Generic (PLEG): container finished" podID="479f957a-0e3f-42e0-a6b2-05d1fa30984f" containerID="86d22f30e5e83ae4c5d4dca67e2ed6b52f85e69b919eeff9fed5260677cd6b8c" exitCode=0 Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.543191 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" event={"ID":"479f957a-0e3f-42e0-a6b2-05d1fa30984f","Type":"ContainerDied","Data":"86d22f30e5e83ae4c5d4dca67e2ed6b52f85e69b919eeff9fed5260677cd6b8c"} Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.552477 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"d359fe986da000baf4416f14e6a6add8b7b7042aba869fb27193d50f2884a38b"} Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.643015 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.730839 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.820935 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28dbb22f-82bf-40db-8de3-1473d2aea429-config\") pod \"28dbb22f-82bf-40db-8de3-1473d2aea429\" (UID: \"28dbb22f-82bf-40db-8de3-1473d2aea429\") " Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.821017 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t28h\" (UniqueName: \"kubernetes.io/projected/28dbb22f-82bf-40db-8de3-1473d2aea429-kube-api-access-2t28h\") pod \"28dbb22f-82bf-40db-8de3-1473d2aea429\" (UID: \"28dbb22f-82bf-40db-8de3-1473d2aea429\") " Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.821138 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28dbb22f-82bf-40db-8de3-1473d2aea429-client-ca\") pod \"28dbb22f-82bf-40db-8de3-1473d2aea429\" (UID: \"28dbb22f-82bf-40db-8de3-1473d2aea429\") " Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.821284 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28dbb22f-82bf-40db-8de3-1473d2aea429-serving-cert\") pod \"28dbb22f-82bf-40db-8de3-1473d2aea429\" (UID: \"28dbb22f-82bf-40db-8de3-1473d2aea429\") " Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.822105 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28dbb22f-82bf-40db-8de3-1473d2aea429-client-ca" (OuterVolumeSpecName: "client-ca") pod "28dbb22f-82bf-40db-8de3-1473d2aea429" (UID: "28dbb22f-82bf-40db-8de3-1473d2aea429"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.822297 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28dbb22f-82bf-40db-8de3-1473d2aea429-config" (OuterVolumeSpecName: "config") pod "28dbb22f-82bf-40db-8de3-1473d2aea429" (UID: "28dbb22f-82bf-40db-8de3-1473d2aea429"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.823145 4742 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28dbb22f-82bf-40db-8de3-1473d2aea429-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.823183 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28dbb22f-82bf-40db-8de3-1473d2aea429-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.826772 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28dbb22f-82bf-40db-8de3-1473d2aea429-kube-api-access-2t28h" (OuterVolumeSpecName: "kube-api-access-2t28h") pod "28dbb22f-82bf-40db-8de3-1473d2aea429" (UID: "28dbb22f-82bf-40db-8de3-1473d2aea429"). InnerVolumeSpecName "kube-api-access-2t28h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.826767 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28dbb22f-82bf-40db-8de3-1473d2aea429-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "28dbb22f-82bf-40db-8de3-1473d2aea429" (UID: "28dbb22f-82bf-40db-8de3-1473d2aea429"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.924236 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-config\") pod \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.924291 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdbvl\" (UniqueName: \"kubernetes.io/projected/479f957a-0e3f-42e0-a6b2-05d1fa30984f-kube-api-access-pdbvl\") pod \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.924333 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-proxy-ca-bundles\") pod \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.924359 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/479f957a-0e3f-42e0-a6b2-05d1fa30984f-serving-cert\") pod \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.924416 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-client-ca\") pod \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\" (UID: \"479f957a-0e3f-42e0-a6b2-05d1fa30984f\") " Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.924789 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t28h\" (UniqueName: \"kubernetes.io/projected/28dbb22f-82bf-40db-8de3-1473d2aea429-kube-api-access-2t28h\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.924806 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28dbb22f-82bf-40db-8de3-1473d2aea429-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.925648 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "479f957a-0e3f-42e0-a6b2-05d1fa30984f" (UID: "479f957a-0e3f-42e0-a6b2-05d1fa30984f"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.925730 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-config" (OuterVolumeSpecName: "config") pod "479f957a-0e3f-42e0-a6b2-05d1fa30984f" (UID: "479f957a-0e3f-42e0-a6b2-05d1fa30984f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.925717 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-client-ca" (OuterVolumeSpecName: "client-ca") pod "479f957a-0e3f-42e0-a6b2-05d1fa30984f" (UID: "479f957a-0e3f-42e0-a6b2-05d1fa30984f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.927382 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479f957a-0e3f-42e0-a6b2-05d1fa30984f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "479f957a-0e3f-42e0-a6b2-05d1fa30984f" (UID: "479f957a-0e3f-42e0-a6b2-05d1fa30984f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:16:49 crc kubenswrapper[4742]: I0317 11:16:49.927453 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479f957a-0e3f-42e0-a6b2-05d1fa30984f-kube-api-access-pdbvl" (OuterVolumeSpecName: "kube-api-access-pdbvl") pod "479f957a-0e3f-42e0-a6b2-05d1fa30984f" (UID: "479f957a-0e3f-42e0-a6b2-05d1fa30984f"). InnerVolumeSpecName "kube-api-access-pdbvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.025552 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.025580 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdbvl\" (UniqueName: \"kubernetes.io/projected/479f957a-0e3f-42e0-a6b2-05d1fa30984f-kube-api-access-pdbvl\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.025589 4742 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.025599 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/479f957a-0e3f-42e0-a6b2-05d1fa30984f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.025607 4742 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/479f957a-0e3f-42e0-a6b2-05d1fa30984f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.457825 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8"] Mar 17 11:16:50 crc kubenswrapper[4742]: E0317 11:16:50.458525 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f02c3898-7b15-4f51-a0ac-45a077355791" containerName="extract-content" Mar 17 11:16:50 crc kubenswrapper[4742]: 
I0317 11:16:50.458538 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f02c3898-7b15-4f51-a0ac-45a077355791" containerName="extract-content" Mar 17 11:16:50 crc kubenswrapper[4742]: E0317 11:16:50.458555 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479f957a-0e3f-42e0-a6b2-05d1fa30984f" containerName="controller-manager" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.458561 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="479f957a-0e3f-42e0-a6b2-05d1fa30984f" containerName="controller-manager" Mar 17 11:16:50 crc kubenswrapper[4742]: E0317 11:16:50.458572 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f02c3898-7b15-4f51-a0ac-45a077355791" containerName="extract-utilities" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.458578 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f02c3898-7b15-4f51-a0ac-45a077355791" containerName="extract-utilities" Mar 17 11:16:50 crc kubenswrapper[4742]: E0317 11:16:50.458586 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28dbb22f-82bf-40db-8de3-1473d2aea429" containerName="route-controller-manager" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.458592 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="28dbb22f-82bf-40db-8de3-1473d2aea429" containerName="route-controller-manager" Mar 17 11:16:50 crc kubenswrapper[4742]: E0317 11:16:50.458604 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f02c3898-7b15-4f51-a0ac-45a077355791" containerName="registry-server" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.458610 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f02c3898-7b15-4f51-a0ac-45a077355791" containerName="registry-server" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.458698 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f02c3898-7b15-4f51-a0ac-45a077355791" containerName="registry-server" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.458711 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="28dbb22f-82bf-40db-8de3-1473d2aea429" containerName="route-controller-manager" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.458718 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="479f957a-0e3f-42e0-a6b2-05d1fa30984f" containerName="controller-manager" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.459124 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.467251 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh"] Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.470747 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.489523 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh"] Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.498374 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8"] Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.560242 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.560249 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7f7c985957-t8psd" event={"ID":"479f957a-0e3f-42e0-a6b2-05d1fa30984f","Type":"ContainerDied","Data":"d2a702a3b2e39e23f71d919bac5b9381266361f9ee72045fb7d1d4df74ab23b5"} Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.560445 4742 scope.go:117] "RemoveContainer" containerID="86d22f30e5e83ae4c5d4dca67e2ed6b52f85e69b919eeff9fed5260677cd6b8c" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.562511 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.562509 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j" event={"ID":"28dbb22f-82bf-40db-8de3-1473d2aea429","Type":"ContainerDied","Data":"c87c61152fbe0a03c81487a773e344393bac6a64c5768edde6807de7d7d5272f"} Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.584676 4742 scope.go:117] "RemoveContainer" containerID="855ae5474342446d428cfefc5b07d24c9d2ba27ba251bc9cb6e2412898b11865" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.592354 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j"] Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.595892 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57c9cf9d98-7t26j"] Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.610625 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7f7c985957-t8psd"] Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.613226 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7f7c985957-t8psd"] Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.635162 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-client-ca\") pod \"controller-manager-54cfbc8d67-z6dz8\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.635198 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae906b54-8d5a-416f-b549-94b44e689a47-serving-cert\") pod \"controller-manager-54cfbc8d67-z6dz8\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.635233 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtjd5\" (UniqueName: \"kubernetes.io/projected/af1ec505-bd47-4dd4-b03c-6623cc217450-kube-api-access-jtjd5\") pod \"route-controller-manager-7bffc89888-k49fh\" (UID: \"af1ec505-bd47-4dd4-b03c-6623cc217450\") " pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.635262 
4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-config\") pod \"controller-manager-54cfbc8d67-z6dz8\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.635389 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-proxy-ca-bundles\") pod \"controller-manager-54cfbc8d67-z6dz8\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.635415 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9j56\" (UniqueName: \"kubernetes.io/projected/ae906b54-8d5a-416f-b549-94b44e689a47-kube-api-access-z9j56\") pod \"controller-manager-54cfbc8d67-z6dz8\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.635459 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1ec505-bd47-4dd4-b03c-6623cc217450-serving-cert\") pod \"route-controller-manager-7bffc89888-k49fh\" (UID: \"af1ec505-bd47-4dd4-b03c-6623cc217450\") " pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.635791 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1ec505-bd47-4dd4-b03c-6623cc217450-config\") pod \"route-controller-manager-7bffc89888-k49fh\" (UID: \"af1ec505-bd47-4dd4-b03c-6623cc217450\") " pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.635826 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af1ec505-bd47-4dd4-b03c-6623cc217450-client-ca\") pod \"route-controller-manager-7bffc89888-k49fh\" (UID: \"af1ec505-bd47-4dd4-b03c-6623cc217450\") " pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.670533 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28dbb22f-82bf-40db-8de3-1473d2aea429" path="/var/lib/kubelet/pods/28dbb22f-82bf-40db-8de3-1473d2aea429/volumes" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.671640 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="479f957a-0e3f-42e0-a6b2-05d1fa30984f" path="/var/lib/kubelet/pods/479f957a-0e3f-42e0-a6b2-05d1fa30984f/volumes" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.736357 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae906b54-8d5a-416f-b549-94b44e689a47-serving-cert\") pod \"controller-manager-54cfbc8d67-z6dz8\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:50 crc 
kubenswrapper[4742]: I0317 11:16:50.736403 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-client-ca\") pod \"controller-manager-54cfbc8d67-z6dz8\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.736438 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtjd5\" (UniqueName: \"kubernetes.io/projected/af1ec505-bd47-4dd4-b03c-6623cc217450-kube-api-access-jtjd5\") pod \"route-controller-manager-7bffc89888-k49fh\" (UID: \"af1ec505-bd47-4dd4-b03c-6623cc217450\") " pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.736464 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-config\") pod \"controller-manager-54cfbc8d67-z6dz8\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.736507 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-proxy-ca-bundles\") pod \"controller-manager-54cfbc8d67-z6dz8\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.737142 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9j56\" (UniqueName: \"kubernetes.io/projected/ae906b54-8d5a-416f-b549-94b44e689a47-kube-api-access-z9j56\") pod \"controller-manager-54cfbc8d67-z6dz8\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.737197 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1ec505-bd47-4dd4-b03c-6623cc217450-serving-cert\") pod \"route-controller-manager-7bffc89888-k49fh\" (UID: \"af1ec505-bd47-4dd4-b03c-6623cc217450\") " pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.738026 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-proxy-ca-bundles\") pod \"controller-manager-54cfbc8d67-z6dz8\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.738278 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-client-ca\") pod \"controller-manager-54cfbc8d67-z6dz8\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.738338 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-config\") pod \"controller-manager-54cfbc8d67-z6dz8\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.738442 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1ec505-bd47-4dd4-b03c-6623cc217450-config\") pod \"route-controller-manager-7bffc89888-k49fh\" (UID: \"af1ec505-bd47-4dd4-b03c-6623cc217450\") " pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.738462 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af1ec505-bd47-4dd4-b03c-6623cc217450-client-ca\") pod \"route-controller-manager-7bffc89888-k49fh\" (UID: \"af1ec505-bd47-4dd4-b03c-6623cc217450\") " pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.739584 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af1ec505-bd47-4dd4-b03c-6623cc217450-client-ca\") pod \"route-controller-manager-7bffc89888-k49fh\" (UID: \"af1ec505-bd47-4dd4-b03c-6623cc217450\") " pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.739936 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1ec505-bd47-4dd4-b03c-6623cc217450-config\") pod \"route-controller-manager-7bffc89888-k49fh\" (UID: \"af1ec505-bd47-4dd4-b03c-6623cc217450\") " pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.742144 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae906b54-8d5a-416f-b549-94b44e689a47-serving-cert\") pod \"controller-manager-54cfbc8d67-z6dz8\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.758099 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1ec505-bd47-4dd4-b03c-6623cc217450-serving-cert\") pod \"route-controller-manager-7bffc89888-k49fh\" (UID: \"af1ec505-bd47-4dd4-b03c-6623cc217450\") " pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.777875 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtjd5\" (UniqueName: \"kubernetes.io/projected/af1ec505-bd47-4dd4-b03c-6623cc217450-kube-api-access-jtjd5\") pod \"route-controller-manager-7bffc89888-k49fh\" (UID: \"af1ec505-bd47-4dd4-b03c-6623cc217450\") " pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.783264 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9j56\" (UniqueName: \"kubernetes.io/projected/ae906b54-8d5a-416f-b549-94b44e689a47-kube-api-access-z9j56\") pod \"controller-manager-54cfbc8d67-z6dz8\" (UID: 
\"ae906b54-8d5a-416f-b549-94b44e689a47\") " pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:50 crc kubenswrapper[4742]: I0317 11:16:50.789782 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.050615 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh"] Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.077187 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.279028 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8"] Mar 17 11:16:51 crc kubenswrapper[4742]: W0317 11:16:51.285855 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae906b54_8d5a_416f_b549_94b44e689a47.slice/crio-9cf9484da796c878913934ccb1f43d4ed385a5786826d51926404c58310fd144 WatchSource:0}: Error finding container 9cf9484da796c878913934ccb1f43d4ed385a5786826d51926404c58310fd144: Status 404 returned error can't find the container with id 9cf9484da796c878913934ccb1f43d4ed385a5786826d51926404c58310fd144 Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.522029 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cnzgk" Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.522069 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cnzgk" Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.575754 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" event={"ID":"af1ec505-bd47-4dd4-b03c-6623cc217450","Type":"ContainerStarted","Data":"d26970f2a48b111722469cecbcaeadfeb60db4629842b508b8e57a82332e41a8"} Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.575810 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" event={"ID":"af1ec505-bd47-4dd4-b03c-6623cc217450","Type":"ContainerStarted","Data":"a557dbfbdb816209f104a48b3f5a96669aee98683e00cc61e8c409c75a458196"} Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.578520 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" event={"ID":"ae906b54-8d5a-416f-b549-94b44e689a47","Type":"ContainerStarted","Data":"9f60a582c3eaa9386306146335daf35af9ab01ec990dbfedfc10e4e63a7a68f4"} Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.578775 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" event={"ID":"ae906b54-8d5a-416f-b549-94b44e689a47","Type":"ContainerStarted","Data":"9cf9484da796c878913934ccb1f43d4ed385a5786826d51926404c58310fd144"} Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.579411 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.588045 4742 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.590225 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cnzgk" Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.590716 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" podStartSLOduration=2.590698794 podStartE2EDuration="2.590698794s" podCreationTimestamp="2026-03-17 11:16:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:16:51.590475586 +0000 UTC m=+314.716603374" watchObservedRunningTime="2026-03-17 11:16:51.590698794 +0000 UTC m=+314.716826552" Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.606121 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" podStartSLOduration=2.606099887 podStartE2EDuration="2.606099887s" podCreationTimestamp="2026-03-17 11:16:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:16:51.605289252 +0000 UTC m=+314.731417030" watchObservedRunningTime="2026-03-17 11:16:51.606099887 +0000 UTC m=+314.732227655" Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.654308 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cnzgk" Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.871234 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qzdpj" Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.871927 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qzdpj" Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.914580 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qzdpj" Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.957997 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5v4hw" Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.958038 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5v4hw" Mar 17 11:16:51 crc kubenswrapper[4742]: I0317 11:16:51.998962 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5v4hw" Mar 17 11:16:52 crc kubenswrapper[4742]: I0317 11:16:52.181490 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x6ffv" Mar 17 11:16:52 crc kubenswrapper[4742]: I0317 11:16:52.181546 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x6ffv" Mar 17 11:16:52 crc kubenswrapper[4742]: I0317 11:16:52.227756 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x6ffv" Mar 17 11:16:52 crc kubenswrapper[4742]: I0317 11:16:52.586628 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:16:52 crc kubenswrapper[4742]: I0317 11:16:52.602525 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:16:52 crc kubenswrapper[4742]: I0317 11:16:52.675034 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x6ffv" Mar 17 11:16:52 crc kubenswrapper[4742]: I0317 11:16:52.675080 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5v4hw" Mar 17 11:16:52 crc kubenswrapper[4742]: I0317 11:16:52.675102 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qzdpj" Mar 17 11:16:53 crc kubenswrapper[4742]: I0317 11:16:53.568618 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qzdpj"] Mar 17 11:16:53 crc kubenswrapper[4742]: I0317 11:16:53.696807 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p27vh" Mar 17 11:16:54 crc kubenswrapper[4742]: I0317 11:16:54.119064 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6mvgb" Mar 17 11:16:54 crc kubenswrapper[4742]: I0317 11:16:54.169787 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x6ffv"] Mar 17 11:16:54 crc kubenswrapper[4742]: I0317 11:16:54.177350 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6mvgb" Mar 17 11:16:54 crc kubenswrapper[4742]: I0317 11:16:54.596364 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x6ffv" podUID="e0543787-88e8-463d-b01b-694ecb854bfa" containerName="registry-server" containerID="cri-o://0584db02d32479030b8788b3be5bf68ebb3fd552ce14569ce6c7774598f6806b" gracePeriod=2 Mar 17 11:16:54 crc kubenswrapper[4742]: I0317 11:16:54.596627 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qzdpj" podUID="c8e4be20-e918-45d7-b026-12ef2abf3462" containerName="registry-server" containerID="cri-o://3215c894254e2324910b80266291f0d92c4ac5bb237be83b41b750a846978808" gracePeriod=2 Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.120847 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x6ffv" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.124883 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qzdpj" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.215682 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7nqr\" (UniqueName: \"kubernetes.io/projected/e0543787-88e8-463d-b01b-694ecb854bfa-kube-api-access-p7nqr\") pod \"e0543787-88e8-463d-b01b-694ecb854bfa\" (UID: \"e0543787-88e8-463d-b01b-694ecb854bfa\") " Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.215785 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8e4be20-e918-45d7-b026-12ef2abf3462-utilities\") pod \"c8e4be20-e918-45d7-b026-12ef2abf3462\" (UID: \"c8e4be20-e918-45d7-b026-12ef2abf3462\") " Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.215821 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0543787-88e8-463d-b01b-694ecb854bfa-utilities\") pod \"e0543787-88e8-463d-b01b-694ecb854bfa\" (UID: \"e0543787-88e8-463d-b01b-694ecb854bfa\") " Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.215901 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8e4be20-e918-45d7-b026-12ef2abf3462-catalog-content\") pod \"c8e4be20-e918-45d7-b026-12ef2abf3462\" (UID: \"c8e4be20-e918-45d7-b026-12ef2abf3462\") " Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.216011 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fmck\" (UniqueName: \"kubernetes.io/projected/c8e4be20-e918-45d7-b026-12ef2abf3462-kube-api-access-9fmck\") pod \"c8e4be20-e918-45d7-b026-12ef2abf3462\" (UID: \"c8e4be20-e918-45d7-b026-12ef2abf3462\") " Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.216079 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0543787-88e8-463d-b01b-694ecb854bfa-catalog-content\") pod \"e0543787-88e8-463d-b01b-694ecb854bfa\" (UID: \"e0543787-88e8-463d-b01b-694ecb854bfa\") " Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.217228 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8e4be20-e918-45d7-b026-12ef2abf3462-utilities" (OuterVolumeSpecName: "utilities") pod "c8e4be20-e918-45d7-b026-12ef2abf3462" (UID: "c8e4be20-e918-45d7-b026-12ef2abf3462"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.218663 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0543787-88e8-463d-b01b-694ecb854bfa-utilities" (OuterVolumeSpecName: "utilities") pod "e0543787-88e8-463d-b01b-694ecb854bfa" (UID: "e0543787-88e8-463d-b01b-694ecb854bfa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.223662 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8e4be20-e918-45d7-b026-12ef2abf3462-kube-api-access-9fmck" (OuterVolumeSpecName: "kube-api-access-9fmck") pod "c8e4be20-e918-45d7-b026-12ef2abf3462" (UID: "c8e4be20-e918-45d7-b026-12ef2abf3462"). InnerVolumeSpecName "kube-api-access-9fmck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.225287 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0543787-88e8-463d-b01b-694ecb854bfa-kube-api-access-p7nqr" (OuterVolumeSpecName: "kube-api-access-p7nqr") pod "e0543787-88e8-463d-b01b-694ecb854bfa" (UID: "e0543787-88e8-463d-b01b-694ecb854bfa"). InnerVolumeSpecName "kube-api-access-p7nqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.277159 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8e4be20-e918-45d7-b026-12ef2abf3462-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8e4be20-e918-45d7-b026-12ef2abf3462" (UID: "c8e4be20-e918-45d7-b026-12ef2abf3462"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.285792 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0543787-88e8-463d-b01b-694ecb854bfa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0543787-88e8-463d-b01b-694ecb854bfa" (UID: "e0543787-88e8-463d-b01b-694ecb854bfa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.317413 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7nqr\" (UniqueName: \"kubernetes.io/projected/e0543787-88e8-463d-b01b-694ecb854bfa-kube-api-access-p7nqr\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.317464 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8e4be20-e918-45d7-b026-12ef2abf3462-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.317478 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0543787-88e8-463d-b01b-694ecb854bfa-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.317488 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8e4be20-e918-45d7-b026-12ef2abf3462-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.317503 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fmck\" (UniqueName: \"kubernetes.io/projected/c8e4be20-e918-45d7-b026-12ef2abf3462-kube-api-access-9fmck\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.317515 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0543787-88e8-463d-b01b-694ecb854bfa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.603064 4742 generic.go:334] "Generic (PLEG): container finished" podID="c8e4be20-e918-45d7-b026-12ef2abf3462" containerID="3215c894254e2324910b80266291f0d92c4ac5bb237be83b41b750a846978808" exitCode=0 Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.603156 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzdpj" 
event={"ID":"c8e4be20-e918-45d7-b026-12ef2abf3462","Type":"ContainerDied","Data":"3215c894254e2324910b80266291f0d92c4ac5bb237be83b41b750a846978808"} Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.603493 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzdpj" event={"ID":"c8e4be20-e918-45d7-b026-12ef2abf3462","Type":"ContainerDied","Data":"133ee933f61944ecf40afae60eaafcd8d818ce9cd470fcf65c90b3c6136a2835"} Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.603183 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qzdpj" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.603526 4742 scope.go:117] "RemoveContainer" containerID="3215c894254e2324910b80266291f0d92c4ac5bb237be83b41b750a846978808" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.606319 4742 generic.go:334] "Generic (PLEG): container finished" podID="e0543787-88e8-463d-b01b-694ecb854bfa" containerID="0584db02d32479030b8788b3be5bf68ebb3fd552ce14569ce6c7774598f6806b" exitCode=0 Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.606356 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x6ffv" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.606366 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6ffv" event={"ID":"e0543787-88e8-463d-b01b-694ecb854bfa","Type":"ContainerDied","Data":"0584db02d32479030b8788b3be5bf68ebb3fd552ce14569ce6c7774598f6806b"} Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.606398 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6ffv" event={"ID":"e0543787-88e8-463d-b01b-694ecb854bfa","Type":"ContainerDied","Data":"f8a18257578e5a388701bfa5e3da59d0226c24a9619a3a87268ebc3b8bfd9611"} Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.628880 4742 scope.go:117] "RemoveContainer" containerID="87a4c635aa94ef01f09f17a7f79683196bcf114d0c4bac75218e53183a620263" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.641019 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x6ffv"] Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.650177 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x6ffv"] Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.660341 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qzdpj"] Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.660919 4742 scope.go:117] "RemoveContainer" containerID="bb65bae714172788645ff2c5182bf02c58aab2c62900ead05c1f40fa4a83df45" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.665346 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qzdpj"] Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.675198 4742 scope.go:117] "RemoveContainer" containerID="3215c894254e2324910b80266291f0d92c4ac5bb237be83b41b750a846978808" Mar 17 11:16:55 crc kubenswrapper[4742]: E0317 11:16:55.675848 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3215c894254e2324910b80266291f0d92c4ac5bb237be83b41b750a846978808\": container with ID starting with 3215c894254e2324910b80266291f0d92c4ac5bb237be83b41b750a846978808 not found: ID does not 
exist" containerID="3215c894254e2324910b80266291f0d92c4ac5bb237be83b41b750a846978808" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.675922 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3215c894254e2324910b80266291f0d92c4ac5bb237be83b41b750a846978808"} err="failed to get container status \"3215c894254e2324910b80266291f0d92c4ac5bb237be83b41b750a846978808\": rpc error: code = NotFound desc = could not find container \"3215c894254e2324910b80266291f0d92c4ac5bb237be83b41b750a846978808\": container with ID starting with 3215c894254e2324910b80266291f0d92c4ac5bb237be83b41b750a846978808 not found: ID does not exist" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.675948 4742 scope.go:117] "RemoveContainer" containerID="87a4c635aa94ef01f09f17a7f79683196bcf114d0c4bac75218e53183a620263" Mar 17 11:16:55 crc kubenswrapper[4742]: E0317 11:16:55.676208 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87a4c635aa94ef01f09f17a7f79683196bcf114d0c4bac75218e53183a620263\": container with ID starting with 87a4c635aa94ef01f09f17a7f79683196bcf114d0c4bac75218e53183a620263 not found: ID does not exist" containerID="87a4c635aa94ef01f09f17a7f79683196bcf114d0c4bac75218e53183a620263" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.676254 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87a4c635aa94ef01f09f17a7f79683196bcf114d0c4bac75218e53183a620263"} err="failed to get container status \"87a4c635aa94ef01f09f17a7f79683196bcf114d0c4bac75218e53183a620263\": rpc error: code = NotFound desc = could not find container \"87a4c635aa94ef01f09f17a7f79683196bcf114d0c4bac75218e53183a620263\": container with ID starting with 87a4c635aa94ef01f09f17a7f79683196bcf114d0c4bac75218e53183a620263 not found: ID does not exist" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.676269 4742 scope.go:117] "RemoveContainer" containerID="bb65bae714172788645ff2c5182bf02c58aab2c62900ead05c1f40fa4a83df45" Mar 17 11:16:55 crc kubenswrapper[4742]: E0317 11:16:55.677086 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb65bae714172788645ff2c5182bf02c58aab2c62900ead05c1f40fa4a83df45\": container with ID starting with bb65bae714172788645ff2c5182bf02c58aab2c62900ead05c1f40fa4a83df45 not found: ID does not exist" containerID="bb65bae714172788645ff2c5182bf02c58aab2c62900ead05c1f40fa4a83df45" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.677156 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb65bae714172788645ff2c5182bf02c58aab2c62900ead05c1f40fa4a83df45"} err="failed to get container status \"bb65bae714172788645ff2c5182bf02c58aab2c62900ead05c1f40fa4a83df45\": rpc error: code = NotFound desc = could not find container \"bb65bae714172788645ff2c5182bf02c58aab2c62900ead05c1f40fa4a83df45\": container with ID starting with bb65bae714172788645ff2c5182bf02c58aab2c62900ead05c1f40fa4a83df45 not found: ID does not exist" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.677191 4742 scope.go:117] "RemoveContainer" containerID="0584db02d32479030b8788b3be5bf68ebb3fd552ce14569ce6c7774598f6806b" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.701276 4742 scope.go:117] "RemoveContainer" containerID="cdcf34cff56f54047d55cef192e5b039a333bf35ae0b417a5691100bee627c83" Mar 17 11:16:55 crc kubenswrapper[4742]: 
I0317 11:16:55.735205 4742 scope.go:117] "RemoveContainer" containerID="ebf507ac9ceaca14caee44b4fa7a51a395a9d11e4dd52e60815e1633cbdf3fe8" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.761248 4742 scope.go:117] "RemoveContainer" containerID="0584db02d32479030b8788b3be5bf68ebb3fd552ce14569ce6c7774598f6806b" Mar 17 11:16:55 crc kubenswrapper[4742]: E0317 11:16:55.762007 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0584db02d32479030b8788b3be5bf68ebb3fd552ce14569ce6c7774598f6806b\": container with ID starting with 0584db02d32479030b8788b3be5bf68ebb3fd552ce14569ce6c7774598f6806b not found: ID does not exist" containerID="0584db02d32479030b8788b3be5bf68ebb3fd552ce14569ce6c7774598f6806b" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.762043 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0584db02d32479030b8788b3be5bf68ebb3fd552ce14569ce6c7774598f6806b"} err="failed to get container status \"0584db02d32479030b8788b3be5bf68ebb3fd552ce14569ce6c7774598f6806b\": rpc error: code = NotFound desc = could not find container \"0584db02d32479030b8788b3be5bf68ebb3fd552ce14569ce6c7774598f6806b\": container with ID starting with 0584db02d32479030b8788b3be5bf68ebb3fd552ce14569ce6c7774598f6806b not found: ID does not exist" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.762069 4742 scope.go:117] "RemoveContainer" containerID="cdcf34cff56f54047d55cef192e5b039a333bf35ae0b417a5691100bee627c83" Mar 17 11:16:55 crc kubenswrapper[4742]: E0317 11:16:55.762487 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdcf34cff56f54047d55cef192e5b039a333bf35ae0b417a5691100bee627c83\": container with ID starting with cdcf34cff56f54047d55cef192e5b039a333bf35ae0b417a5691100bee627c83 not found: ID does not exist" containerID="cdcf34cff56f54047d55cef192e5b039a333bf35ae0b417a5691100bee627c83" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.762519 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdcf34cff56f54047d55cef192e5b039a333bf35ae0b417a5691100bee627c83"} err="failed to get container status \"cdcf34cff56f54047d55cef192e5b039a333bf35ae0b417a5691100bee627c83\": rpc error: code = NotFound desc = could not find container \"cdcf34cff56f54047d55cef192e5b039a333bf35ae0b417a5691100bee627c83\": container with ID starting with cdcf34cff56f54047d55cef192e5b039a333bf35ae0b417a5691100bee627c83 not found: ID does not exist" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.762557 4742 scope.go:117] "RemoveContainer" containerID="ebf507ac9ceaca14caee44b4fa7a51a395a9d11e4dd52e60815e1633cbdf3fe8" Mar 17 11:16:55 crc kubenswrapper[4742]: E0317 11:16:55.763036 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebf507ac9ceaca14caee44b4fa7a51a395a9d11e4dd52e60815e1633cbdf3fe8\": container with ID starting with ebf507ac9ceaca14caee44b4fa7a51a395a9d11e4dd52e60815e1633cbdf3fe8 not found: ID does not exist" containerID="ebf507ac9ceaca14caee44b4fa7a51a395a9d11e4dd52e60815e1633cbdf3fe8" Mar 17 11:16:55 crc kubenswrapper[4742]: I0317 11:16:55.763077 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebf507ac9ceaca14caee44b4fa7a51a395a9d11e4dd52e60815e1633cbdf3fe8"} err="failed to get container status 
\"ebf507ac9ceaca14caee44b4fa7a51a395a9d11e4dd52e60815e1633cbdf3fe8\": rpc error: code = NotFound desc = could not find container \"ebf507ac9ceaca14caee44b4fa7a51a395a9d11e4dd52e60815e1633cbdf3fe8\": container with ID starting with ebf507ac9ceaca14caee44b4fa7a51a395a9d11e4dd52e60815e1633cbdf3fe8 not found: ID does not exist" Mar 17 11:16:56 crc kubenswrapper[4742]: I0317 11:16:56.568689 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6mvgb"] Mar 17 11:16:56 crc kubenswrapper[4742]: I0317 11:16:56.569142 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6mvgb" podUID="d416e1fd-2137-48a4-b933-b25a9ca94a8a" containerName="registry-server" containerID="cri-o://180c4598b6b78e487454c88e69b19b12636da56e0db520083d00fcd0cd4efd57" gracePeriod=2 Mar 17 11:16:56 crc kubenswrapper[4742]: I0317 11:16:56.669157 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8e4be20-e918-45d7-b026-12ef2abf3462" path="/var/lib/kubelet/pods/c8e4be20-e918-45d7-b026-12ef2abf3462/volumes" Mar 17 11:16:56 crc kubenswrapper[4742]: I0317 11:16:56.670021 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0543787-88e8-463d-b01b-694ecb854bfa" path="/var/lib/kubelet/pods/e0543787-88e8-463d-b01b-694ecb854bfa/volumes" Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.023630 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6mvgb" Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.040602 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9xsm\" (UniqueName: \"kubernetes.io/projected/d416e1fd-2137-48a4-b933-b25a9ca94a8a-kube-api-access-s9xsm\") pod \"d416e1fd-2137-48a4-b933-b25a9ca94a8a\" (UID: \"d416e1fd-2137-48a4-b933-b25a9ca94a8a\") " Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.040718 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d416e1fd-2137-48a4-b933-b25a9ca94a8a-utilities\") pod \"d416e1fd-2137-48a4-b933-b25a9ca94a8a\" (UID: \"d416e1fd-2137-48a4-b933-b25a9ca94a8a\") " Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.040783 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d416e1fd-2137-48a4-b933-b25a9ca94a8a-catalog-content\") pod \"d416e1fd-2137-48a4-b933-b25a9ca94a8a\" (UID: \"d416e1fd-2137-48a4-b933-b25a9ca94a8a\") " Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.042896 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d416e1fd-2137-48a4-b933-b25a9ca94a8a-utilities" (OuterVolumeSpecName: "utilities") pod "d416e1fd-2137-48a4-b933-b25a9ca94a8a" (UID: "d416e1fd-2137-48a4-b933-b25a9ca94a8a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.071293 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d416e1fd-2137-48a4-b933-b25a9ca94a8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d416e1fd-2137-48a4-b933-b25a9ca94a8a" (UID: "d416e1fd-2137-48a4-b933-b25a9ca94a8a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.073109 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d416e1fd-2137-48a4-b933-b25a9ca94a8a-kube-api-access-s9xsm" (OuterVolumeSpecName: "kube-api-access-s9xsm") pod "d416e1fd-2137-48a4-b933-b25a9ca94a8a" (UID: "d416e1fd-2137-48a4-b933-b25a9ca94a8a"). InnerVolumeSpecName "kube-api-access-s9xsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.142859 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9xsm\" (UniqueName: \"kubernetes.io/projected/d416e1fd-2137-48a4-b933-b25a9ca94a8a-kube-api-access-s9xsm\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.142955 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d416e1fd-2137-48a4-b933-b25a9ca94a8a-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.142984 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d416e1fd-2137-48a4-b933-b25a9ca94a8a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.625131 4742 generic.go:334] "Generic (PLEG): container finished" podID="d416e1fd-2137-48a4-b933-b25a9ca94a8a" containerID="180c4598b6b78e487454c88e69b19b12636da56e0db520083d00fcd0cd4efd57" exitCode=0 Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.625468 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mvgb" event={"ID":"d416e1fd-2137-48a4-b933-b25a9ca94a8a","Type":"ContainerDied","Data":"180c4598b6b78e487454c88e69b19b12636da56e0db520083d00fcd0cd4efd57"} Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.625511 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mvgb" event={"ID":"d416e1fd-2137-48a4-b933-b25a9ca94a8a","Type":"ContainerDied","Data":"782e31003da6e9c2ce12c25c7cfdb411809080f461c17ec85b30bd2d0a523261"} Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.625548 4742 scope.go:117] "RemoveContainer" containerID="180c4598b6b78e487454c88e69b19b12636da56e0db520083d00fcd0cd4efd57" Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.625709 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6mvgb" Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.660697 4742 scope.go:117] "RemoveContainer" containerID="6ed674a64a5dc5b434ad8ef82af06dd2fc6ac4452025080af4763f35894c9c9e" Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.681145 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6mvgb"] Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.684807 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6mvgb"] Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.706266 4742 scope.go:117] "RemoveContainer" containerID="777778a0954a81b2fc2dc89e3f6cdef1faeae71a25103e1258b624d944e961bf" Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.722957 4742 scope.go:117] "RemoveContainer" containerID="180c4598b6b78e487454c88e69b19b12636da56e0db520083d00fcd0cd4efd57" Mar 17 11:16:57 crc kubenswrapper[4742]: E0317 11:16:57.723403 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"180c4598b6b78e487454c88e69b19b12636da56e0db520083d00fcd0cd4efd57\": container with ID starting with 180c4598b6b78e487454c88e69b19b12636da56e0db520083d00fcd0cd4efd57 not found: ID does not exist" containerID="180c4598b6b78e487454c88e69b19b12636da56e0db520083d00fcd0cd4efd57" Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.723439 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"180c4598b6b78e487454c88e69b19b12636da56e0db520083d00fcd0cd4efd57"} err="failed to get container status \"180c4598b6b78e487454c88e69b19b12636da56e0db520083d00fcd0cd4efd57\": rpc error: code = NotFound desc = could not find container \"180c4598b6b78e487454c88e69b19b12636da56e0db520083d00fcd0cd4efd57\": container with ID starting with 180c4598b6b78e487454c88e69b19b12636da56e0db520083d00fcd0cd4efd57 not found: ID does not exist" Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.723463 4742 scope.go:117] "RemoveContainer" containerID="6ed674a64a5dc5b434ad8ef82af06dd2fc6ac4452025080af4763f35894c9c9e" Mar 17 11:16:57 crc kubenswrapper[4742]: E0317 11:16:57.723776 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ed674a64a5dc5b434ad8ef82af06dd2fc6ac4452025080af4763f35894c9c9e\": container with ID starting with 6ed674a64a5dc5b434ad8ef82af06dd2fc6ac4452025080af4763f35894c9c9e not found: ID does not exist" containerID="6ed674a64a5dc5b434ad8ef82af06dd2fc6ac4452025080af4763f35894c9c9e" Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.723800 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ed674a64a5dc5b434ad8ef82af06dd2fc6ac4452025080af4763f35894c9c9e"} err="failed to get container status \"6ed674a64a5dc5b434ad8ef82af06dd2fc6ac4452025080af4763f35894c9c9e\": rpc error: code = NotFound desc = could not find container \"6ed674a64a5dc5b434ad8ef82af06dd2fc6ac4452025080af4763f35894c9c9e\": container with ID starting with 6ed674a64a5dc5b434ad8ef82af06dd2fc6ac4452025080af4763f35894c9c9e not found: ID does not exist" Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.723820 4742 scope.go:117] "RemoveContainer" containerID="777778a0954a81b2fc2dc89e3f6cdef1faeae71a25103e1258b624d944e961bf" Mar 17 11:16:57 crc kubenswrapper[4742]: E0317 11:16:57.724185 4742 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"777778a0954a81b2fc2dc89e3f6cdef1faeae71a25103e1258b624d944e961bf\": container with ID starting with 777778a0954a81b2fc2dc89e3f6cdef1faeae71a25103e1258b624d944e961bf not found: ID does not exist" containerID="777778a0954a81b2fc2dc89e3f6cdef1faeae71a25103e1258b624d944e961bf" Mar 17 11:16:57 crc kubenswrapper[4742]: I0317 11:16:57.724212 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777778a0954a81b2fc2dc89e3f6cdef1faeae71a25103e1258b624d944e961bf"} err="failed to get container status \"777778a0954a81b2fc2dc89e3f6cdef1faeae71a25103e1258b624d944e961bf\": rpc error: code = NotFound desc = could not find container \"777778a0954a81b2fc2dc89e3f6cdef1faeae71a25103e1258b624d944e961bf\": container with ID starting with 777778a0954a81b2fc2dc89e3f6cdef1faeae71a25103e1258b624d944e961bf not found: ID does not exist" Mar 17 11:16:58 crc kubenswrapper[4742]: I0317 11:16:58.673421 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d416e1fd-2137-48a4-b933-b25a9ca94a8a" path="/var/lib/kubelet/pods/d416e1fd-2137-48a4-b933-b25a9ca94a8a/volumes" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.046309 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8"] Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.046996 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" podUID="ae906b54-8d5a-416f-b549-94b44e689a47" containerName="controller-manager" containerID="cri-o://9f60a582c3eaa9386306146335daf35af9ab01ec990dbfedfc10e4e63a7a68f4" gracePeriod=30 Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.078507 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" podUID="61b81f5a-30d8-4c88-899b-5effb490bdee" containerName="oauth-openshift" containerID="cri-o://394aa3c16e64ab8ab054db1eea6d6fdde6e1172c949c1fcb9ef49e6d47f51abd" gracePeriod=15 Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.154332 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh"] Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.154562 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" podUID="af1ec505-bd47-4dd4-b03c-6623cc217450" containerName="route-controller-manager" containerID="cri-o://d26970f2a48b111722469cecbcaeadfeb60db4629842b508b8e57a82332e41a8" gracePeriod=30 Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.616666 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.621480 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.644343 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.697700 4742 generic.go:334] "Generic (PLEG): container finished" podID="af1ec505-bd47-4dd4-b03c-6623cc217450" containerID="d26970f2a48b111722469cecbcaeadfeb60db4629842b508b8e57a82332e41a8" exitCode=0 Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.697782 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" event={"ID":"af1ec505-bd47-4dd4-b03c-6623cc217450","Type":"ContainerDied","Data":"d26970f2a48b111722469cecbcaeadfeb60db4629842b508b8e57a82332e41a8"} Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.697809 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" event={"ID":"af1ec505-bd47-4dd4-b03c-6623cc217450","Type":"ContainerDied","Data":"a557dbfbdb816209f104a48b3f5a96669aee98683e00cc61e8c409c75a458196"} Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.697853 4742 scope.go:117] "RemoveContainer" containerID="d26970f2a48b111722469cecbcaeadfeb60db4629842b508b8e57a82332e41a8" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.698072 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.701525 4742 generic.go:334] "Generic (PLEG): container finished" podID="61b81f5a-30d8-4c88-899b-5effb490bdee" containerID="394aa3c16e64ab8ab054db1eea6d6fdde6e1172c949c1fcb9ef49e6d47f51abd" exitCode=0 Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.701713 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" event={"ID":"61b81f5a-30d8-4c88-899b-5effb490bdee","Type":"ContainerDied","Data":"394aa3c16e64ab8ab054db1eea6d6fdde6e1172c949c1fcb9ef49e6d47f51abd"} Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.701766 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" event={"ID":"61b81f5a-30d8-4c88-899b-5effb490bdee","Type":"ContainerDied","Data":"9064259a5caadf070f835f74744e684fe009d596a72c7a92e1c453129c9dcfb7"} Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.701859 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-spkdx" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709222 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae906b54-8d5a-416f-b549-94b44e689a47-serving-cert\") pod \"ae906b54-8d5a-416f-b549-94b44e689a47\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709268 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtjd5\" (UniqueName: \"kubernetes.io/projected/af1ec505-bd47-4dd4-b03c-6623cc217450-kube-api-access-jtjd5\") pod \"af1ec505-bd47-4dd4-b03c-6623cc217450\" (UID: \"af1ec505-bd47-4dd4-b03c-6623cc217450\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709292 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-service-ca\") pod \"61b81f5a-30d8-4c88-899b-5effb490bdee\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709315 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-config\") pod \"ae906b54-8d5a-416f-b549-94b44e689a47\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709338 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-session\") pod \"61b81f5a-30d8-4c88-899b-5effb490bdee\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709366 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1ec505-bd47-4dd4-b03c-6623cc217450-serving-cert\") pod \"af1ec505-bd47-4dd4-b03c-6623cc217450\" (UID: \"af1ec505-bd47-4dd4-b03c-6623cc217450\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709387 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-provider-selection\") pod \"61b81f5a-30d8-4c88-899b-5effb490bdee\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709411 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-idp-0-file-data\") pod \"61b81f5a-30d8-4c88-899b-5effb490bdee\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709457 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-proxy-ca-bundles\") pod \"ae906b54-8d5a-416f-b549-94b44e689a47\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709477 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-ocp-branding-template\") pod \"61b81f5a-30d8-4c88-899b-5effb490bdee\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709499 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9j56\" (UniqueName: \"kubernetes.io/projected/ae906b54-8d5a-416f-b549-94b44e689a47-kube-api-access-z9j56\") pod \"ae906b54-8d5a-416f-b549-94b44e689a47\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709522 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-router-certs\") pod \"61b81f5a-30d8-4c88-899b-5effb490bdee\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709547 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-trusted-ca-bundle\") pod \"61b81f5a-30d8-4c88-899b-5effb490bdee\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709569 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1ec505-bd47-4dd4-b03c-6623cc217450-config\") pod \"af1ec505-bd47-4dd4-b03c-6623cc217450\" (UID: \"af1ec505-bd47-4dd4-b03c-6623cc217450\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709593 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvhnt\" (UniqueName: \"kubernetes.io/projected/61b81f5a-30d8-4c88-899b-5effb490bdee-kube-api-access-jvhnt\") pod \"61b81f5a-30d8-4c88-899b-5effb490bdee\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709613 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-cliconfig\") pod \"61b81f5a-30d8-4c88-899b-5effb490bdee\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709636 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/61b81f5a-30d8-4c88-899b-5effb490bdee-audit-dir\") pod \"61b81f5a-30d8-4c88-899b-5effb490bdee\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709677 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-login\") pod \"61b81f5a-30d8-4c88-899b-5effb490bdee\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709707 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-error\") pod 
\"61b81f5a-30d8-4c88-899b-5effb490bdee\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709727 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af1ec505-bd47-4dd4-b03c-6623cc217450-client-ca\") pod \"af1ec505-bd47-4dd4-b03c-6623cc217450\" (UID: \"af1ec505-bd47-4dd4-b03c-6623cc217450\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709754 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-client-ca\") pod \"ae906b54-8d5a-416f-b549-94b44e689a47\" (UID: \"ae906b54-8d5a-416f-b549-94b44e689a47\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709771 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-audit-policies\") pod \"61b81f5a-30d8-4c88-899b-5effb490bdee\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.709788 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-serving-cert\") pod \"61b81f5a-30d8-4c88-899b-5effb490bdee\" (UID: \"61b81f5a-30d8-4c88-899b-5effb490bdee\") " Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.710878 4742 generic.go:334] "Generic (PLEG): container finished" podID="ae906b54-8d5a-416f-b549-94b44e689a47" containerID="9f60a582c3eaa9386306146335daf35af9ab01ec990dbfedfc10e4e63a7a68f4" exitCode=0 Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.710931 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" event={"ID":"ae906b54-8d5a-416f-b549-94b44e689a47","Type":"ContainerDied","Data":"9f60a582c3eaa9386306146335daf35af9ab01ec990dbfedfc10e4e63a7a68f4"} Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.710970 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" event={"ID":"ae906b54-8d5a-416f-b549-94b44e689a47","Type":"ContainerDied","Data":"9cf9484da796c878913934ccb1f43d4ed385a5786826d51926404c58310fd144"} Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.711048 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.715323 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "61b81f5a-30d8-4c88-899b-5effb490bdee" (UID: "61b81f5a-30d8-4c88-899b-5effb490bdee"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.715389 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-config" (OuterVolumeSpecName: "config") pod "ae906b54-8d5a-416f-b549-94b44e689a47" (UID: "ae906b54-8d5a-416f-b549-94b44e689a47"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.716224 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ae906b54-8d5a-416f-b549-94b44e689a47" (UID: "ae906b54-8d5a-416f-b549-94b44e689a47"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.716379 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "61b81f5a-30d8-4c88-899b-5effb490bdee" (UID: "61b81f5a-30d8-4c88-899b-5effb490bdee"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.716965 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61b81f5a-30d8-4c88-899b-5effb490bdee-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "61b81f5a-30d8-4c88-899b-5effb490bdee" (UID: "61b81f5a-30d8-4c88-899b-5effb490bdee"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.717716 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-client-ca" (OuterVolumeSpecName: "client-ca") pod "ae906b54-8d5a-416f-b549-94b44e689a47" (UID: "ae906b54-8d5a-416f-b549-94b44e689a47"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.719447 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1ec505-bd47-4dd4-b03c-6623cc217450-client-ca" (OuterVolumeSpecName: "client-ca") pod "af1ec505-bd47-4dd4-b03c-6623cc217450" (UID: "af1ec505-bd47-4dd4-b03c-6623cc217450"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.719755 4742 scope.go:117] "RemoveContainer" containerID="d26970f2a48b111722469cecbcaeadfeb60db4629842b508b8e57a82332e41a8" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.720049 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "61b81f5a-30d8-4c88-899b-5effb490bdee" (UID: "61b81f5a-30d8-4c88-899b-5effb490bdee"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.721459 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "61b81f5a-30d8-4c88-899b-5effb490bdee" (UID: "61b81f5a-30d8-4c88-899b-5effb490bdee"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.721710 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "61b81f5a-30d8-4c88-899b-5effb490bdee" (UID: "61b81f5a-30d8-4c88-899b-5effb490bdee"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: E0317 11:17:09.721707 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d26970f2a48b111722469cecbcaeadfeb60db4629842b508b8e57a82332e41a8\": container with ID starting with d26970f2a48b111722469cecbcaeadfeb60db4629842b508b8e57a82332e41a8 not found: ID does not exist" containerID="d26970f2a48b111722469cecbcaeadfeb60db4629842b508b8e57a82332e41a8" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.721750 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d26970f2a48b111722469cecbcaeadfeb60db4629842b508b8e57a82332e41a8"} err="failed to get container status \"d26970f2a48b111722469cecbcaeadfeb60db4629842b508b8e57a82332e41a8\": rpc error: code = NotFound desc = could not find container \"d26970f2a48b111722469cecbcaeadfeb60db4629842b508b8e57a82332e41a8\": container with ID starting with d26970f2a48b111722469cecbcaeadfeb60db4629842b508b8e57a82332e41a8 not found: ID does not exist" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.721772 4742 scope.go:117] "RemoveContainer" containerID="394aa3c16e64ab8ab054db1eea6d6fdde6e1172c949c1fcb9ef49e6d47f51abd" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.722133 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1ec505-bd47-4dd4-b03c-6623cc217450-kube-api-access-jtjd5" (OuterVolumeSpecName: "kube-api-access-jtjd5") pod "af1ec505-bd47-4dd4-b03c-6623cc217450" (UID: "af1ec505-bd47-4dd4-b03c-6623cc217450"). InnerVolumeSpecName "kube-api-access-jtjd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.722326 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1ec505-bd47-4dd4-b03c-6623cc217450-config" (OuterVolumeSpecName: "config") pod "af1ec505-bd47-4dd4-b03c-6623cc217450" (UID: "af1ec505-bd47-4dd4-b03c-6623cc217450"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.722665 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "61b81f5a-30d8-4c88-899b-5effb490bdee" (UID: "61b81f5a-30d8-4c88-899b-5effb490bdee"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.722826 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1ec505-bd47-4dd4-b03c-6623cc217450-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "af1ec505-bd47-4dd4-b03c-6623cc217450" (UID: "af1ec505-bd47-4dd4-b03c-6623cc217450"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.724030 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "61b81f5a-30d8-4c88-899b-5effb490bdee" (UID: "61b81f5a-30d8-4c88-899b-5effb490bdee"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.724986 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "61b81f5a-30d8-4c88-899b-5effb490bdee" (UID: "61b81f5a-30d8-4c88-899b-5effb490bdee"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.732957 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "61b81f5a-30d8-4c88-899b-5effb490bdee" (UID: "61b81f5a-30d8-4c88-899b-5effb490bdee"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.732964 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61b81f5a-30d8-4c88-899b-5effb490bdee-kube-api-access-jvhnt" (OuterVolumeSpecName: "kube-api-access-jvhnt") pod "61b81f5a-30d8-4c88-899b-5effb490bdee" (UID: "61b81f5a-30d8-4c88-899b-5effb490bdee"). InnerVolumeSpecName "kube-api-access-jvhnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.733195 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae906b54-8d5a-416f-b549-94b44e689a47-kube-api-access-z9j56" (OuterVolumeSpecName: "kube-api-access-z9j56") pod "ae906b54-8d5a-416f-b549-94b44e689a47" (UID: "ae906b54-8d5a-416f-b549-94b44e689a47"). InnerVolumeSpecName "kube-api-access-z9j56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.733260 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae906b54-8d5a-416f-b549-94b44e689a47-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ae906b54-8d5a-416f-b549-94b44e689a47" (UID: "ae906b54-8d5a-416f-b549-94b44e689a47"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.733480 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "61b81f5a-30d8-4c88-899b-5effb490bdee" (UID: "61b81f5a-30d8-4c88-899b-5effb490bdee"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.736148 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "61b81f5a-30d8-4c88-899b-5effb490bdee" (UID: "61b81f5a-30d8-4c88-899b-5effb490bdee"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.736252 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "61b81f5a-30d8-4c88-899b-5effb490bdee" (UID: "61b81f5a-30d8-4c88-899b-5effb490bdee"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.744240 4742 scope.go:117] "RemoveContainer" containerID="394aa3c16e64ab8ab054db1eea6d6fdde6e1172c949c1fcb9ef49e6d47f51abd" Mar 17 11:17:09 crc kubenswrapper[4742]: E0317 11:17:09.746689 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"394aa3c16e64ab8ab054db1eea6d6fdde6e1172c949c1fcb9ef49e6d47f51abd\": container with ID starting with 394aa3c16e64ab8ab054db1eea6d6fdde6e1172c949c1fcb9ef49e6d47f51abd not found: ID does not exist" containerID="394aa3c16e64ab8ab054db1eea6d6fdde6e1172c949c1fcb9ef49e6d47f51abd" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.746734 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"394aa3c16e64ab8ab054db1eea6d6fdde6e1172c949c1fcb9ef49e6d47f51abd"} err="failed to get container status \"394aa3c16e64ab8ab054db1eea6d6fdde6e1172c949c1fcb9ef49e6d47f51abd\": rpc error: code = NotFound desc = could not find container \"394aa3c16e64ab8ab054db1eea6d6fdde6e1172c949c1fcb9ef49e6d47f51abd\": container with ID starting with 394aa3c16e64ab8ab054db1eea6d6fdde6e1172c949c1fcb9ef49e6d47f51abd not found: ID does not exist" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.746760 4742 scope.go:117] "RemoveContainer" containerID="9f60a582c3eaa9386306146335daf35af9ab01ec990dbfedfc10e4e63a7a68f4" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.772556 4742 scope.go:117] "RemoveContainer" containerID="9f60a582c3eaa9386306146335daf35af9ab01ec990dbfedfc10e4e63a7a68f4" Mar 17 11:17:09 crc kubenswrapper[4742]: E0317 11:17:09.777040 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f60a582c3eaa9386306146335daf35af9ab01ec990dbfedfc10e4e63a7a68f4\": container with ID starting with 9f60a582c3eaa9386306146335daf35af9ab01ec990dbfedfc10e4e63a7a68f4 not found: ID does not exist" containerID="9f60a582c3eaa9386306146335daf35af9ab01ec990dbfedfc10e4e63a7a68f4" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.777085 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f60a582c3eaa9386306146335daf35af9ab01ec990dbfedfc10e4e63a7a68f4"} err="failed to get container status \"9f60a582c3eaa9386306146335daf35af9ab01ec990dbfedfc10e4e63a7a68f4\": rpc error: code = NotFound desc = could not find container \"9f60a582c3eaa9386306146335daf35af9ab01ec990dbfedfc10e4e63a7a68f4\": 
container with ID starting with 9f60a582c3eaa9386306146335daf35af9ab01ec990dbfedfc10e4e63a7a68f4 not found: ID does not exist" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.810980 4742 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811008 4742 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811017 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811027 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae906b54-8d5a-416f-b549-94b44e689a47-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811037 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtjd5\" (UniqueName: \"kubernetes.io/projected/af1ec505-bd47-4dd4-b03c-6623cc217450-kube-api-access-jtjd5\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811046 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811054 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811062 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811072 4742 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1ec505-bd47-4dd4-b03c-6623cc217450-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811080 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811089 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811098 4742 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ae906b54-8d5a-416f-b549-94b44e689a47-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 
11:17:09.811108 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811117 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9j56\" (UniqueName: \"kubernetes.io/projected/ae906b54-8d5a-416f-b549-94b44e689a47-kube-api-access-z9j56\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811126 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811134 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811142 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1ec505-bd47-4dd4-b03c-6623cc217450-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811150 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvhnt\" (UniqueName: \"kubernetes.io/projected/61b81f5a-30d8-4c88-899b-5effb490bdee-kube-api-access-jvhnt\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811158 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811166 4742 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/61b81f5a-30d8-4c88-899b-5effb490bdee-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811174 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811182 4742 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/61b81f5a-30d8-4c88-899b-5effb490bdee-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:09 crc kubenswrapper[4742]: I0317 11:17:09.811191 4742 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af1ec505-bd47-4dd4-b03c-6623cc217450-client-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.030098 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh"] Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.036116 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bffc89888-k49fh"] Mar 17 11:17:10 crc kubenswrapper[4742]: 
I0317 11:17:10.045033 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-spkdx"] Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.047641 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-spkdx"] Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.055364 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8"] Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.057616 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-54cfbc8d67-z6dz8"] Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.475422 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf"] Mar 17 11:17:10 crc kubenswrapper[4742]: E0317 11:17:10.475680 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0543787-88e8-463d-b01b-694ecb854bfa" containerName="extract-utilities" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.475693 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0543787-88e8-463d-b01b-694ecb854bfa" containerName="extract-utilities" Mar 17 11:17:10 crc kubenswrapper[4742]: E0317 11:17:10.475704 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1ec505-bd47-4dd4-b03c-6623cc217450" containerName="route-controller-manager" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.475711 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1ec505-bd47-4dd4-b03c-6623cc217450" containerName="route-controller-manager" Mar 17 11:17:10 crc kubenswrapper[4742]: E0317 11:17:10.475723 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8e4be20-e918-45d7-b026-12ef2abf3462" containerName="extract-utilities" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.475728 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e4be20-e918-45d7-b026-12ef2abf3462" containerName="extract-utilities" Mar 17 11:17:10 crc kubenswrapper[4742]: E0317 11:17:10.475735 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0543787-88e8-463d-b01b-694ecb854bfa" containerName="registry-server" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.475740 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0543787-88e8-463d-b01b-694ecb854bfa" containerName="registry-server" Mar 17 11:17:10 crc kubenswrapper[4742]: E0317 11:17:10.475748 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d416e1fd-2137-48a4-b933-b25a9ca94a8a" containerName="extract-content" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.475753 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d416e1fd-2137-48a4-b933-b25a9ca94a8a" containerName="extract-content" Mar 17 11:17:10 crc kubenswrapper[4742]: E0317 11:17:10.475759 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0543787-88e8-463d-b01b-694ecb854bfa" containerName="extract-content" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.475765 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0543787-88e8-463d-b01b-694ecb854bfa" containerName="extract-content" Mar 17 11:17:10 crc kubenswrapper[4742]: E0317 11:17:10.475773 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae906b54-8d5a-416f-b549-94b44e689a47" containerName="controller-manager" Mar 17 11:17:10 crc 
kubenswrapper[4742]: I0317 11:17:10.475780 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae906b54-8d5a-416f-b549-94b44e689a47" containerName="controller-manager" Mar 17 11:17:10 crc kubenswrapper[4742]: E0317 11:17:10.475796 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d416e1fd-2137-48a4-b933-b25a9ca94a8a" containerName="extract-utilities" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.475802 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d416e1fd-2137-48a4-b933-b25a9ca94a8a" containerName="extract-utilities" Mar 17 11:17:10 crc kubenswrapper[4742]: E0317 11:17:10.475813 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8e4be20-e918-45d7-b026-12ef2abf3462" containerName="registry-server" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.475819 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e4be20-e918-45d7-b026-12ef2abf3462" containerName="registry-server" Mar 17 11:17:10 crc kubenswrapper[4742]: E0317 11:17:10.475828 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d416e1fd-2137-48a4-b933-b25a9ca94a8a" containerName="registry-server" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.475834 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d416e1fd-2137-48a4-b933-b25a9ca94a8a" containerName="registry-server" Mar 17 11:17:10 crc kubenswrapper[4742]: E0317 11:17:10.475844 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8e4be20-e918-45d7-b026-12ef2abf3462" containerName="extract-content" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.475850 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e4be20-e918-45d7-b026-12ef2abf3462" containerName="extract-content" Mar 17 11:17:10 crc kubenswrapper[4742]: E0317 11:17:10.475858 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b81f5a-30d8-4c88-899b-5effb490bdee" containerName="oauth-openshift" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.475863 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b81f5a-30d8-4c88-899b-5effb490bdee" containerName="oauth-openshift" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.475963 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae906b54-8d5a-416f-b549-94b44e689a47" containerName="controller-manager" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.475975 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d416e1fd-2137-48a4-b933-b25a9ca94a8a" containerName="registry-server" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.475983 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8e4be20-e918-45d7-b026-12ef2abf3462" containerName="registry-server" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.475993 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1ec505-bd47-4dd4-b03c-6623cc217450" containerName="route-controller-manager" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.475999 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0543787-88e8-463d-b01b-694ecb854bfa" containerName="registry-server" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.476007 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="61b81f5a-30d8-4c88-899b-5effb490bdee" containerName="oauth-openshift" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.476370 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.479886 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.479892 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.480155 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.480414 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.480794 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.482067 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54574b9498-6qfj8"] Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.482404 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.483368 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.487412 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.487635 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.488139 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.488260 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.488714 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.492886 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.502620 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf"] Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.504537 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.508835 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54574b9498-6qfj8"] Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.518615 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/afc4748b-e56f-4cb0-a9ab-659a95e7b97e-config\") pod \"controller-manager-54574b9498-6qfj8\" (UID: \"afc4748b-e56f-4cb0-a9ab-659a95e7b97e\") " pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.518765 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afc4748b-e56f-4cb0-a9ab-659a95e7b97e-client-ca\") pod \"controller-manager-54574b9498-6qfj8\" (UID: \"afc4748b-e56f-4cb0-a9ab-659a95e7b97e\") " pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.518855 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9c46e19-2ab8-454b-852f-35027ca518de-client-ca\") pod \"route-controller-manager-67dff6f899-ktqdf\" (UID: \"f9c46e19-2ab8-454b-852f-35027ca518de\") " pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.518965 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zd5t\" (UniqueName: \"kubernetes.io/projected/afc4748b-e56f-4cb0-a9ab-659a95e7b97e-kube-api-access-9zd5t\") pod \"controller-manager-54574b9498-6qfj8\" (UID: \"afc4748b-e56f-4cb0-a9ab-659a95e7b97e\") " pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.519033 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afc4748b-e56f-4cb0-a9ab-659a95e7b97e-proxy-ca-bundles\") pod \"controller-manager-54574b9498-6qfj8\" (UID: \"afc4748b-e56f-4cb0-a9ab-659a95e7b97e\") " pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.519073 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afc4748b-e56f-4cb0-a9ab-659a95e7b97e-serving-cert\") pod \"controller-manager-54574b9498-6qfj8\" (UID: \"afc4748b-e56f-4cb0-a9ab-659a95e7b97e\") " pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.519105 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsfxz\" (UniqueName: \"kubernetes.io/projected/f9c46e19-2ab8-454b-852f-35027ca518de-kube-api-access-qsfxz\") pod \"route-controller-manager-67dff6f899-ktqdf\" (UID: \"f9c46e19-2ab8-454b-852f-35027ca518de\") " pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.519213 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9c46e19-2ab8-454b-852f-35027ca518de-config\") pod \"route-controller-manager-67dff6f899-ktqdf\" (UID: \"f9c46e19-2ab8-454b-852f-35027ca518de\") " pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.519260 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/f9c46e19-2ab8-454b-852f-35027ca518de-serving-cert\") pod \"route-controller-manager-67dff6f899-ktqdf\" (UID: \"f9c46e19-2ab8-454b-852f-35027ca518de\") " pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.620641 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afc4748b-e56f-4cb0-a9ab-659a95e7b97e-config\") pod \"controller-manager-54574b9498-6qfj8\" (UID: \"afc4748b-e56f-4cb0-a9ab-659a95e7b97e\") " pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.620697 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afc4748b-e56f-4cb0-a9ab-659a95e7b97e-client-ca\") pod \"controller-manager-54574b9498-6qfj8\" (UID: \"afc4748b-e56f-4cb0-a9ab-659a95e7b97e\") " pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.620737 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9c46e19-2ab8-454b-852f-35027ca518de-client-ca\") pod \"route-controller-manager-67dff6f899-ktqdf\" (UID: \"f9c46e19-2ab8-454b-852f-35027ca518de\") " pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.620770 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zd5t\" (UniqueName: \"kubernetes.io/projected/afc4748b-e56f-4cb0-a9ab-659a95e7b97e-kube-api-access-9zd5t\") pod \"controller-manager-54574b9498-6qfj8\" (UID: \"afc4748b-e56f-4cb0-a9ab-659a95e7b97e\") " pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.620803 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afc4748b-e56f-4cb0-a9ab-659a95e7b97e-proxy-ca-bundles\") pod \"controller-manager-54574b9498-6qfj8\" (UID: \"afc4748b-e56f-4cb0-a9ab-659a95e7b97e\") " pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.620838 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afc4748b-e56f-4cb0-a9ab-659a95e7b97e-serving-cert\") pod \"controller-manager-54574b9498-6qfj8\" (UID: \"afc4748b-e56f-4cb0-a9ab-659a95e7b97e\") " pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.620863 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsfxz\" (UniqueName: \"kubernetes.io/projected/f9c46e19-2ab8-454b-852f-35027ca518de-kube-api-access-qsfxz\") pod \"route-controller-manager-67dff6f899-ktqdf\" (UID: \"f9c46e19-2ab8-454b-852f-35027ca518de\") " pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.620922 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9c46e19-2ab8-454b-852f-35027ca518de-config\") pod \"route-controller-manager-67dff6f899-ktqdf\" (UID: 
\"f9c46e19-2ab8-454b-852f-35027ca518de\") " pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.620945 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c46e19-2ab8-454b-852f-35027ca518de-serving-cert\") pod \"route-controller-manager-67dff6f899-ktqdf\" (UID: \"f9c46e19-2ab8-454b-852f-35027ca518de\") " pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.622393 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f9c46e19-2ab8-454b-852f-35027ca518de-client-ca\") pod \"route-controller-manager-67dff6f899-ktqdf\" (UID: \"f9c46e19-2ab8-454b-852f-35027ca518de\") " pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.623025 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afc4748b-e56f-4cb0-a9ab-659a95e7b97e-client-ca\") pod \"controller-manager-54574b9498-6qfj8\" (UID: \"afc4748b-e56f-4cb0-a9ab-659a95e7b97e\") " pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.623431 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afc4748b-e56f-4cb0-a9ab-659a95e7b97e-proxy-ca-bundles\") pod \"controller-manager-54574b9498-6qfj8\" (UID: \"afc4748b-e56f-4cb0-a9ab-659a95e7b97e\") " pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.623706 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9c46e19-2ab8-454b-852f-35027ca518de-config\") pod \"route-controller-manager-67dff6f899-ktqdf\" (UID: \"f9c46e19-2ab8-454b-852f-35027ca518de\") " pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.623794 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afc4748b-e56f-4cb0-a9ab-659a95e7b97e-config\") pod \"controller-manager-54574b9498-6qfj8\" (UID: \"afc4748b-e56f-4cb0-a9ab-659a95e7b97e\") " pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.628102 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afc4748b-e56f-4cb0-a9ab-659a95e7b97e-serving-cert\") pod \"controller-manager-54574b9498-6qfj8\" (UID: \"afc4748b-e56f-4cb0-a9ab-659a95e7b97e\") " pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.629166 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9c46e19-2ab8-454b-852f-35027ca518de-serving-cert\") pod \"route-controller-manager-67dff6f899-ktqdf\" (UID: \"f9c46e19-2ab8-454b-852f-35027ca518de\") " pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.644826 4742 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9zd5t\" (UniqueName: \"kubernetes.io/projected/afc4748b-e56f-4cb0-a9ab-659a95e7b97e-kube-api-access-9zd5t\") pod \"controller-manager-54574b9498-6qfj8\" (UID: \"afc4748b-e56f-4cb0-a9ab-659a95e7b97e\") " pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.645657 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsfxz\" (UniqueName: \"kubernetes.io/projected/f9c46e19-2ab8-454b-852f-35027ca518de-kube-api-access-qsfxz\") pod \"route-controller-manager-67dff6f899-ktqdf\" (UID: \"f9c46e19-2ab8-454b-852f-35027ca518de\") " pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.672735 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61b81f5a-30d8-4c88-899b-5effb490bdee" path="/var/lib/kubelet/pods/61b81f5a-30d8-4c88-899b-5effb490bdee/volumes" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.673839 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae906b54-8d5a-416f-b549-94b44e689a47" path="/var/lib/kubelet/pods/ae906b54-8d5a-416f-b549-94b44e689a47/volumes" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.674995 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1ec505-bd47-4dd4-b03c-6623cc217450" path="/var/lib/kubelet/pods/af1ec505-bd47-4dd4-b03c-6623cc217450/volumes" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.815303 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" Mar 17 11:17:10 crc kubenswrapper[4742]: I0317 11:17:10.827724 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:11 crc kubenswrapper[4742]: I0317 11:17:11.261497 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf"] Mar 17 11:17:11 crc kubenswrapper[4742]: W0317 11:17:11.279286 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9c46e19_2ab8_454b_852f_35027ca518de.slice/crio-a48ff948b89b8628eeb6e97e10ac0069a248312fccfbd45852c89d5ddd6266bc WatchSource:0}: Error finding container a48ff948b89b8628eeb6e97e10ac0069a248312fccfbd45852c89d5ddd6266bc: Status 404 returned error can't find the container with id a48ff948b89b8628eeb6e97e10ac0069a248312fccfbd45852c89d5ddd6266bc Mar 17 11:17:11 crc kubenswrapper[4742]: I0317 11:17:11.313079 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54574b9498-6qfj8"] Mar 17 11:17:11 crc kubenswrapper[4742]: W0317 11:17:11.320713 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafc4748b_e56f_4cb0_a9ab_659a95e7b97e.slice/crio-7ad7371ff7965eb2c235177bf5db711521c7d1466e974611b0784fe80fa556be WatchSource:0}: Error finding container 7ad7371ff7965eb2c235177bf5db711521c7d1466e974611b0784fe80fa556be: Status 404 returned error can't find the container with id 7ad7371ff7965eb2c235177bf5db711521c7d1466e974611b0784fe80fa556be Mar 17 11:17:11 crc kubenswrapper[4742]: I0317 11:17:11.728256 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" event={"ID":"f9c46e19-2ab8-454b-852f-35027ca518de","Type":"ContainerStarted","Data":"f6fafe1d299a7d121f3f3a5f8abf4fbe780400d815d58a59813dd978b3bed071"} Mar 17 11:17:11 crc kubenswrapper[4742]: I0317 11:17:11.728958 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" Mar 17 11:17:11 crc kubenswrapper[4742]: I0317 11:17:11.728992 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" event={"ID":"f9c46e19-2ab8-454b-852f-35027ca518de","Type":"ContainerStarted","Data":"a48ff948b89b8628eeb6e97e10ac0069a248312fccfbd45852c89d5ddd6266bc"} Mar 17 11:17:11 crc kubenswrapper[4742]: I0317 11:17:11.729604 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" event={"ID":"afc4748b-e56f-4cb0-a9ab-659a95e7b97e","Type":"ContainerStarted","Data":"49ca208c3dcb5f75e8284b6e21b4c2d656e1faae98bab338b1c6437a9716fa86"} Mar 17 11:17:11 crc kubenswrapper[4742]: I0317 11:17:11.729652 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" event={"ID":"afc4748b-e56f-4cb0-a9ab-659a95e7b97e","Type":"ContainerStarted","Data":"7ad7371ff7965eb2c235177bf5db711521c7d1466e974611b0784fe80fa556be"} Mar 17 11:17:11 crc kubenswrapper[4742]: I0317 11:17:11.730057 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:11 crc kubenswrapper[4742]: I0317 11:17:11.734042 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" Mar 17 11:17:11 crc kubenswrapper[4742]: I0317 11:17:11.745207 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" podStartSLOduration=2.745191942 podStartE2EDuration="2.745191942s" podCreationTimestamp="2026-03-17 11:17:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:17:11.744886953 +0000 UTC m=+334.871014711" watchObservedRunningTime="2026-03-17 11:17:11.745191942 +0000 UTC m=+334.871319700" Mar 17 11:17:11 crc kubenswrapper[4742]: I0317 11:17:11.764659 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54574b9498-6qfj8" podStartSLOduration=2.764641421 podStartE2EDuration="2.764641421s" podCreationTimestamp="2026-03-17 11:17:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:17:11.763806086 +0000 UTC m=+334.889933844" watchObservedRunningTime="2026-03-17 11:17:11.764641421 +0000 UTC m=+334.890769179" Mar 17 11:17:11 crc kubenswrapper[4742]: I0317 11:17:11.953437 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67dff6f899-ktqdf" Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.817874 4742 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.818658 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.818732 4742 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.819344 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd" gracePeriod=15 Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.819425 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873" gracePeriod=15 Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.819415 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7" gracePeriod=15 Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.819504 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe" gracePeriod=15 Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.819453 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae" gracePeriod=15 Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.821895 4742 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 17 11:17:12 crc kubenswrapper[4742]: E0317 11:17:12.822405 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.822441 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 11:17:12 crc kubenswrapper[4742]: E0317 11:17:12.822460 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.822474 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 17 11:17:12 crc kubenswrapper[4742]: E0317 11:17:12.822498 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.822514 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 
11:17:12 crc kubenswrapper[4742]: E0317 11:17:12.822535 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.822551 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 11:17:12 crc kubenswrapper[4742]: E0317 11:17:12.822576 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.822589 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 17 11:17:12 crc kubenswrapper[4742]: E0317 11:17:12.822609 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.822623 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 17 11:17:12 crc kubenswrapper[4742]: E0317 11:17:12.822638 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.822652 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 17 11:17:12 crc kubenswrapper[4742]: E0317 11:17:12.822670 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.822683 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 17 11:17:12 crc kubenswrapper[4742]: E0317 11:17:12.822698 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.822711 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.822945 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.822964 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.822982 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.823000 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.823026 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.823046 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.823064 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.823078 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.823093 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 17 11:17:12 crc kubenswrapper[4742]: E0317 11:17:12.823295 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.823312 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.853446 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.853531 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.853572 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.853644 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.853687 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.853713 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.853730 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.853785 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.883268 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.954469 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.954522 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.954574 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.954587 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.954607 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.954645 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.954652 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.954671 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.954702 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.954697 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.954734 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.954677 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.954852 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.954987 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.955077 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 17 11:17:12 crc kubenswrapper[4742]: I0317 11:17:12.955100 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.169162 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 17 11:17:13 crc kubenswrapper[4742]: E0317 11:17:13.207422 4742 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.230:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d9ccd27f26f9f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:17:13.206271903 +0000 UTC m=+336.332399701,LastTimestamp:2026-03-17 11:17:13.206271903 +0000 UTC m=+336.332399701,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.754625 4742 generic.go:334] "Generic (PLEG): container finished" podID="0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6" containerID="748188c35812e90d9fef199e90c431a40e09dc09f4e13ae6bc0fe6f6b755c967" exitCode=0
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.754725 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6","Type":"ContainerDied","Data":"748188c35812e90d9fef199e90c431a40e09dc09f4e13ae6bc0fe6f6b755c967"}
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.756258 4742 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.757069 4742 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.757557 4742 status_manager.go:851] "Failed to get status for pod" podUID="0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.760331 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.763762 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.765832 4742 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7" exitCode=0
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.765881 4742 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae" exitCode=0
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.765893 4742 scope.go:117] "RemoveContainer" containerID="a533c705e5c302ab34992d64a1738a6ca7eadbb583def0ac78dbf7ed25a15200"
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.765902 4742 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873" exitCode=0
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.766077 4742 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe" exitCode=2
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.768624 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"73b9d948bfb791cffd4b87224d182f5a6de0f0055887c2f713ff8bd36b6e9e9d"}
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.768695 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"86541412e1370e1c75fc00194a5d4d581e5cbce6252bbe23e912415dcb3c4529"}
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.769635 4742 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.770127 4742 status_manager.go:851] "Failed to get status for pod" podUID="0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:13 crc kubenswrapper[4742]: I0317 11:17:13.770590 4742 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:14 crc kubenswrapper[4742]: E0317 11:17:14.455868 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:17:14Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:17:14Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:17:14Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:17:14Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:14 crc kubenswrapper[4742]: E0317 11:17:14.456179 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:14 crc kubenswrapper[4742]: E0317 11:17:14.456405 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:14 crc kubenswrapper[4742]: E0317 11:17:14.456643 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:14 crc kubenswrapper[4742]: E0317 11:17:14.456941 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:14 crc kubenswrapper[4742]: E0317 11:17:14.456962 4742 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 17 11:17:14 crc kubenswrapper[4742]: I0317 11:17:14.784202 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.322114 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.323834 4742 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.324465 4742 status_manager.go:851] "Failed to get status for pod" podUID="0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.336741 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.339414 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.340177 4742 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.340717 4742 status_manager.go:851] "Failed to get status for pod" podUID="0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.341330 4742 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.394211 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-kube-api-access\") pod \"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6\" (UID: \"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6\") "
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.394282 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.394309 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.394330 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-kubelet-dir\") pod \"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6\" (UID: \"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6\") "
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.394358 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.394400 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-var-lock\") pod \"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6\" (UID: \"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6\") "
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.394401 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.394439 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6" (UID: "0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.394456 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.394443 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.394549 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-var-lock" (OuterVolumeSpecName: "var-lock") pod "0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6" (UID: "0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.394618 4742 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.394631 4742 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.394640 4742 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.394649 4742 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.394657 4742 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-var-lock\") on node \"crc\" DevicePath \"\""
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.400157 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6" (UID: "0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.495707 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.798657 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.800444 4742 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd" exitCode=0
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.800595 4742 scope.go:117] "RemoveContainer" containerID="2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.800655 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.804096 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6","Type":"ContainerDied","Data":"62fc9caf74e9cf002e89804c637b030227182227b79e2dcd33987b804d3783eb"}
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.804158 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62fc9caf74e9cf002e89804c637b030227182227b79e2dcd33987b804d3783eb"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.804163 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.828741 4742 scope.go:117] "RemoveContainer" containerID="1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.829849 4742 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.830388 4742 status_manager.go:851] "Failed to get status for pod" podUID="0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.830734 4742 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.831286 4742 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.831796 4742 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.832530 4742 status_manager.go:851] "Failed to get status for pod" podUID="0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.850209 4742 scope.go:117] "RemoveContainer" containerID="b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.871647 4742 scope.go:117] "RemoveContainer" containerID="45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.898031 4742 scope.go:117] "RemoveContainer" containerID="92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.926439 4742 scope.go:117] "RemoveContainer" containerID="bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.955939 4742 scope.go:117] "RemoveContainer" containerID="2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7"
Mar 17 11:17:15 crc kubenswrapper[4742]: E0317 11:17:15.956967 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\": container with ID starting with 2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7 not found: ID does not exist" containerID="2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.957039 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7"} err="failed to get container status \"2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\": rpc error: code = NotFound desc = could not find container \"2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7\": container with ID starting with 2a47f1359a0ee8250dd64aa9cef61d6c7ba74649081f28c4a691424069fdf6e7 not found: ID does not exist"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.957078 4742 scope.go:117] "RemoveContainer" containerID="1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae"
Mar 17 11:17:15 crc kubenswrapper[4742]: E0317 11:17:15.958104 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\": container with ID starting with 1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae not found: ID does not exist" containerID="1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.958139 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae"} err="failed to get container status \"1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\": rpc error: code = NotFound desc = could not find container \"1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae\": container with ID starting with 1fa451296db1f0fc4f24cb9600ff0732673ad40b6a31c928ce471035c410fcae not found: ID does not exist"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.958166 4742 scope.go:117] "RemoveContainer" containerID="b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873"
Mar 17 11:17:15 crc kubenswrapper[4742]: E0317 11:17:15.958788 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\": container with ID starting with b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873 not found: ID does not exist" containerID="b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.958834 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873"} err="failed to get container status \"b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\": rpc error: code = NotFound desc = could not find container \"b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873\": container with ID starting with b8041d4e1e0bd9c47751726f5b548528cba1194fad701381f72e131bb581c873 not found: ID does not exist"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.958864 4742 scope.go:117] "RemoveContainer" containerID="45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe"
Mar 17 11:17:15 crc kubenswrapper[4742]: E0317 11:17:15.959451 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\": container with ID starting with 45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe not found: ID does not exist" containerID="45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.959494 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe"} err="failed to get container status \"45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\": rpc error: code = NotFound desc = could not find container \"45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe\": container with ID starting with 45b309629a6595d77522c7076cca64d557fe02c6e3089a99e3ab94143ed982fe not found: ID does not exist"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.959520 4742 scope.go:117] "RemoveContainer" containerID="92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd"
Mar 17 11:17:15 crc kubenswrapper[4742]: E0317 11:17:15.960200 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\": container with ID starting with 92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd not found: ID does not exist" containerID="92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.960239 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd"} err="failed to get container status \"92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\": rpc error: code = NotFound desc = could not find container \"92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd\": container with ID starting with 92f58fc10756645ef335da4b1ce6373ad385efe7a46159a63e82ee103a149fcd not found: ID does not exist"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.960260 4742 scope.go:117] "RemoveContainer" containerID="bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e"
Mar 17 11:17:15 crc kubenswrapper[4742]: E0317 11:17:15.960559 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\": container with ID starting with bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e not found: ID does not exist" containerID="bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e"
Mar 17 11:17:15 crc kubenswrapper[4742]: I0317 11:17:15.960608 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e"} err="failed to get container status \"bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\": rpc error: code = NotFound desc = could not find container \"bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e\": container with ID starting with bb610662d5f79fc27293e0c3611ec9ad7634c7fb4a6dd70a9f9e84bfe1c71c5e not found: ID does not exist"
Mar 17 11:17:16 crc kubenswrapper[4742]: I0317 11:17:16.670373 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Mar 17 11:17:18 crc kubenswrapper[4742]: I0317 11:17:18.668635 4742 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:18 crc kubenswrapper[4742]: I0317 11:17:18.668983 4742 status_manager.go:851] "Failed to get status for pod" podUID="0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:18 crc kubenswrapper[4742]: E0317 11:17:18.712236 4742 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.230:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d9ccd27f26f9f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-17 11:17:13.206271903 +0000 UTC m=+336.332399701,LastTimestamp:2026-03-17 11:17:13.206271903 +0000 UTC m=+336.332399701,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 17 11:17:22 crc kubenswrapper[4742]: E0317 11:17:22.645875 4742 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:22 crc kubenswrapper[4742]: E0317 11:17:22.646882 4742 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:22 crc kubenswrapper[4742]: E0317 11:17:22.647491 4742 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:22 crc kubenswrapper[4742]: E0317 11:17:22.648062 4742 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:22 crc kubenswrapper[4742]: E0317 11:17:22.648550 4742 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:22 crc kubenswrapper[4742]: I0317 11:17:22.648620 4742 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 17 11:17:22 crc kubenswrapper[4742]: E0317 11:17:22.649323 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="200ms"
Mar 17 11:17:22 crc kubenswrapper[4742]: E0317 11:17:22.851041 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="400ms"
Mar 17 11:17:23 crc kubenswrapper[4742]: E0317 11:17:23.251877 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="800ms"
Mar 17 11:17:23 crc kubenswrapper[4742]: I0317 11:17:23.662559 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:17:23 crc kubenswrapper[4742]: I0317 11:17:23.663479 4742 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:23 crc kubenswrapper[4742]: I0317 11:17:23.663991 4742 status_manager.go:851] "Failed to get status for pod" podUID="0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused"
Mar 17 11:17:23 crc kubenswrapper[4742]: I0317 11:17:23.687443 4742 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0631e65b-dd02-40a7-8d35-2e4c66b70cd0"
Mar 17 11:17:23 crc kubenswrapper[4742]: I0317 11:17:23.687744 4742 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0631e65b-dd02-40a7-8d35-2e4c66b70cd0"
Mar 17 11:17:23 crc kubenswrapper[4742]: E0317 11:17:23.688277 4742 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:17:23 crc kubenswrapper[4742]: I0317 11:17:23.688975 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 17 11:17:23 crc kubenswrapper[4742]: W0317 11:17:23.736042 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-8487d3b8ebacec61c649a0e27fa62a5957d2709f47c70cc57c61ba03b2e552f4 WatchSource:0}: Error finding container 8487d3b8ebacec61c649a0e27fa62a5957d2709f47c70cc57c61ba03b2e552f4: Status 404 returned error can't find the container with id 8487d3b8ebacec61c649a0e27fa62a5957d2709f47c70cc57c61ba03b2e552f4
Mar 17 11:17:23 crc kubenswrapper[4742]: I0317 11:17:23.870650 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8487d3b8ebacec61c649a0e27fa62a5957d2709f47c70cc57c61ba03b2e552f4"}
Mar 17 11:17:24 crc kubenswrapper[4742]: E0317 11:17:24.052867 4742 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" interval="1.6s"
Mar 17 11:17:24 crc kubenswrapper[4742]: E0317 11:17:24.832744 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:17:24Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:17:24Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:17:24Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-17T11:17:24Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 17 11:17:24 crc kubenswrapper[4742]: E0317 11:17:24.834695 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 17 11:17:24 crc kubenswrapper[4742]: E0317 11:17:24.835271 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 17 11:17:24 crc kubenswrapper[4742]: E0317 11:17:24.835821 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 17 11:17:24 crc kubenswrapper[4742]: E0317 11:17:24.836458 4742 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 17 11:17:24 crc kubenswrapper[4742]: E0317 11:17:24.836495 4742 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 17 11:17:24 crc kubenswrapper[4742]: I0317 11:17:24.884053 4742 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="01b210261945555d65ea5894fc7d553319ccea70bef98e2e6e29f661d32357d4" exitCode=0 Mar 17 11:17:24 crc kubenswrapper[4742]: I0317 11:17:24.884414 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"01b210261945555d65ea5894fc7d553319ccea70bef98e2e6e29f661d32357d4"} Mar 17 11:17:24 crc kubenswrapper[4742]: I0317 11:17:24.884534 4742 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0631e65b-dd02-40a7-8d35-2e4c66b70cd0" Mar 17 11:17:24 crc kubenswrapper[4742]: I0317 11:17:24.884741 4742 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0631e65b-dd02-40a7-8d35-2e4c66b70cd0" Mar 17 11:17:24 crc kubenswrapper[4742]: I0317 11:17:24.885102 4742 status_manager.go:851] "Failed to get status 
for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 17 11:17:24 crc kubenswrapper[4742]: E0317 11:17:24.885521 4742 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:17:24 crc kubenswrapper[4742]: I0317 11:17:24.885606 4742 status_manager.go:851] "Failed to get status for pod" podUID="0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.230:6443: connect: connection refused" Mar 17 11:17:25 crc kubenswrapper[4742]: I0317 11:17:25.915986 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3b748b553495ec25a5b824b3c14f882fe980505f2be70ec08e88877bfb1c25bb"} Mar 17 11:17:25 crc kubenswrapper[4742]: I0317 11:17:25.916455 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f0d74cbb2835d66ce5fc54620bda94f6964e80f52c28b3035277e15596ac4766"} Mar 17 11:17:25 crc kubenswrapper[4742]: I0317 11:17:25.916476 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a15053d999a0fae2c49ee73feccb4bb33282fd306681c7392c7617fbdfee7fa1"} Mar 17 11:17:25 crc kubenswrapper[4742]: I0317 11:17:25.922616 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 17 11:17:25 crc kubenswrapper[4742]: I0317 11:17:25.924190 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 17 11:17:25 crc kubenswrapper[4742]: I0317 11:17:25.924238 4742 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f30c24d97c9524fad5a195f249e664ea02183bdf272a5cf4c18ca8ca92847249" exitCode=1 Mar 17 11:17:25 crc kubenswrapper[4742]: I0317 11:17:25.924272 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f30c24d97c9524fad5a195f249e664ea02183bdf272a5cf4c18ca8ca92847249"} Mar 17 11:17:25 crc kubenswrapper[4742]: I0317 11:17:25.924674 4742 scope.go:117] "RemoveContainer" containerID="f30c24d97c9524fad5a195f249e664ea02183bdf272a5cf4c18ca8ca92847249" Mar 17 11:17:27 crc kubenswrapper[4742]: I0317 11:17:27.284564 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 11:17:27 crc kubenswrapper[4742]: I0317 11:17:27.291971 4742 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 17 11:17:27 crc kubenswrapper[4742]: I0317 11:17:27.293488 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 17 11:17:27 crc kubenswrapper[4742]: I0317 11:17:27.293589 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"51809027b61cfcd77b3dc3b653d24f819c214c405acec3eb2cf66c8b2de83fa0"} Mar 17 11:17:27 crc kubenswrapper[4742]: I0317 11:17:27.302668 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"705bee1cbaf54b5ff291cfe2150895803c62b780cc525f4ffb42e0ed74227c09"} Mar 17 11:17:27 crc kubenswrapper[4742]: I0317 11:17:27.302698 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"03d1d0bbfb8b926f6a3bbea567dd524360ea9d9d02eb1af87d43fadf2d642909"} Mar 17 11:17:27 crc kubenswrapper[4742]: I0317 11:17:27.302887 4742 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0631e65b-dd02-40a7-8d35-2e4c66b70cd0" Mar 17 11:17:27 crc kubenswrapper[4742]: I0317 11:17:27.302899 4742 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0631e65b-dd02-40a7-8d35-2e4c66b70cd0" Mar 17 11:17:27 crc kubenswrapper[4742]: I0317 11:17:27.302990 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:17:27 crc kubenswrapper[4742]: I0317 11:17:27.676826 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:17:27 crc kubenswrapper[4742]: I0317 11:17:27.676991 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:17:27 crc kubenswrapper[4742]: I0317 11:17:27.677050 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:17:27 crc kubenswrapper[4742]: I0317 11:17:27.677127 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:17:27 crc kubenswrapper[4742]: I0317 11:17:27.930756 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 11:17:27 crc kubenswrapper[4742]: I0317 11:17:27.939453 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 11:17:28 crc kubenswrapper[4742]: I0317 11:17:28.309215 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 11:17:28 crc kubenswrapper[4742]: E0317 11:17:28.677743 4742 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 17 11:17:28 crc kubenswrapper[4742]: E0317 11:17:28.677768 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 17 11:17:28 crc kubenswrapper[4742]: E0317 11:17:28.677836 4742 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 17 11:17:28 crc kubenswrapper[4742]: E0317 11:17:28.677859 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:19:30.677829701 +0000 UTC m=+473.803957469 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Mar 17 11:17:28 crc kubenswrapper[4742]: E0317 11:17:28.678302 4742 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Mar 17 11:17:28 crc kubenswrapper[4742]: E0317 11:17:28.678460 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-17 11:19:30.678437809 +0000 UTC m=+473.804565577 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Mar 17 11:17:28 crc kubenswrapper[4742]: I0317 11:17:28.680298 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 17 11:17:28 crc kubenswrapper[4742]: E0317 11:17:28.688563 4742 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition Mar 17 11:17:28 crc kubenswrapper[4742]: E0317 11:17:28.688757 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-17 11:19:30.68873279 +0000 UTC m=+473.814860558 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition Mar 17 11:17:28 crc kubenswrapper[4742]: E0317 11:17:28.688625 4742 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Mar 17 11:17:28 crc kubenswrapper[4742]: E0317 11:17:28.689008 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-17 11:19:30.688996178 +0000 UTC m=+473.815123956 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Mar 17 11:17:28 crc kubenswrapper[4742]: I0317 11:17:28.689131 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:17:28 crc kubenswrapper[4742]: I0317 11:17:28.689168 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:17:28 crc kubenswrapper[4742]: I0317 11:17:28.695209 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:17:32 crc kubenswrapper[4742]: I0317 11:17:32.315017 4742 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:17:32 crc kubenswrapper[4742]: I0317 11:17:32.348543 4742 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0631e65b-dd02-40a7-8d35-2e4c66b70cd0" Mar 17 11:17:32 crc kubenswrapper[4742]: I0317 11:17:32.348590 4742 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0631e65b-dd02-40a7-8d35-2e4c66b70cd0" Mar 17 11:17:32 crc kubenswrapper[4742]: I0317 11:17:32.355003 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:17:32 crc kubenswrapper[4742]: I0317 11:17:32.358709 4742 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a3f10da4-9646-4ce0-b85f-89f323b9c67d" Mar 17 11:17:32 crc kubenswrapper[4742]: I0317 11:17:32.682322 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 17 11:17:32 crc kubenswrapper[4742]: I0317 11:17:32.682455 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 17 11:17:32 crc kubenswrapper[4742]: I0317 11:17:32.684225 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 17 11:17:33 crc kubenswrapper[4742]: I0317 11:17:33.355209 4742 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0631e65b-dd02-40a7-8d35-2e4c66b70cd0" Mar 17 11:17:33 crc kubenswrapper[4742]: I0317 11:17:33.355257 4742 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0631e65b-dd02-40a7-8d35-2e4c66b70cd0" Mar 17 11:17:36 crc kubenswrapper[4742]: I0317 11:17:36.599595 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs\") pod \"network-metrics-daemon-drnx8\" (UID: \"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\") " pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:17:36 crc kubenswrapper[4742]: I0317 11:17:36.602417 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 17 11:17:36 crc kubenswrapper[4742]: I0317 11:17:36.620572 
4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14-metrics-certs\") pod \"network-metrics-daemon-drnx8\" (UID: \"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14\") " pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:17:36 crc kubenswrapper[4742]: I0317 11:17:36.689044 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 17 11:17:36 crc kubenswrapper[4742]: I0317 11:17:36.697283 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drnx8" Mar 17 11:17:37 crc kubenswrapper[4742]: I0317 11:17:37.123217 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 17 11:17:37 crc kubenswrapper[4742]: W0317 11:17:37.156264 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bea5e2a_bd64_4f05_a257_c8b9c9ca0d14.slice/crio-468b91542214da3487aa3057d508fd9708c88bcf02e67b78d56551a892942930 WatchSource:0}: Error finding container 468b91542214da3487aa3057d508fd9708c88bcf02e67b78d56551a892942930: Status 404 returned error can't find the container with id 468b91542214da3487aa3057d508fd9708c88bcf02e67b78d56551a892942930 Mar 17 11:17:37 crc kubenswrapper[4742]: I0317 11:17:37.389980 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-drnx8" event={"ID":"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14","Type":"ContainerStarted","Data":"468b91542214da3487aa3057d508fd9708c88bcf02e67b78d56551a892942930"} Mar 17 11:17:38 crc kubenswrapper[4742]: I0317 11:17:38.029206 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 17 11:17:38 crc kubenswrapper[4742]: I0317 11:17:38.400045 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-drnx8" event={"ID":"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14","Type":"ContainerStarted","Data":"0c16c67959a88e740cd984bd2327731a255ef940931e287eb4ee708899f7028c"} Mar 17 11:17:38 crc kubenswrapper[4742]: I0317 11:17:38.400108 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-drnx8" event={"ID":"6bea5e2a-bd64-4f05-a257-c8b9c9ca0d14","Type":"ContainerStarted","Data":"3cd282c87850a1bc341862b472274a93bbe004d92537faf64728161ee2867dd4"} Mar 17 11:17:38 crc kubenswrapper[4742]: I0317 11:17:38.455035 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 17 11:17:38 crc kubenswrapper[4742]: I0317 11:17:38.696202 4742 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a3f10da4-9646-4ce0-b85f-89f323b9c67d" Mar 17 11:17:39 crc kubenswrapper[4742]: I0317 11:17:39.698889 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 17 11:17:39 crc kubenswrapper[4742]: I0317 11:17:39.955325 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 17 11:17:40 crc kubenswrapper[4742]: I0317 11:17:40.033650 4742 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 17 11:17:42 crc kubenswrapper[4742]: E0317 11:17:42.702528 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 17 11:17:42 crc kubenswrapper[4742]: E0317 11:17:42.716541 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 17 11:17:42 crc kubenswrapper[4742]: E0317 11:17:42.729068 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 17 11:17:43 crc kubenswrapper[4742]: I0317 11:17:43.728431 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 17 11:17:43 crc kubenswrapper[4742]: I0317 11:17:43.844666 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 17 11:17:44 crc kubenswrapper[4742]: I0317 11:17:44.209573 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 17 11:17:45 crc kubenswrapper[4742]: I0317 11:17:45.014433 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 17 11:17:45 crc kubenswrapper[4742]: I0317 11:17:45.055221 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 17 11:17:45 crc kubenswrapper[4742]: I0317 11:17:45.196436 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 17 11:17:45 crc kubenswrapper[4742]: I0317 11:17:45.353016 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 17 11:17:45 crc kubenswrapper[4742]: I0317 11:17:45.506071 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 17 11:17:45 crc kubenswrapper[4742]: I0317 11:17:45.508100 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 17 11:17:45 crc kubenswrapper[4742]: I0317 11:17:45.652441 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 17 11:17:45 crc kubenswrapper[4742]: I0317 11:17:45.698562 4742 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 17 11:17:45 crc kubenswrapper[4742]: I0317 11:17:45.715704 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 17 11:17:45 crc kubenswrapper[4742]: I0317 11:17:45.754290 4742 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 17 11:17:45 crc kubenswrapper[4742]: I0317 11:17:45.964418 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.203230 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.413345 4742 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.415709 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-drnx8" podStartSLOduration=302.415683672 podStartE2EDuration="5m2.415683672s" podCreationTimestamp="2026-03-17 11:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:17:38.423697702 +0000 UTC m=+361.549825490" watchObservedRunningTime="2026-03-17 11:17:46.415683672 +0000 UTC m=+369.541811470" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.419357 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=34.419345879 podStartE2EDuration="34.419345879s" podCreationTimestamp="2026-03-17 11:17:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:17:31.612045464 +0000 UTC m=+354.738173232" watchObservedRunningTime="2026-03-17 11:17:46.419345879 +0000 UTC m=+369.545473667" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.421029 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.421082 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb"] Mar 17 11:17:46 crc kubenswrapper[4742]: E0317 11:17:46.421352 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6" containerName="installer" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.421380 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6" containerName="installer" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.421780 4742 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0631e65b-dd02-40a7-8d35-2e4c66b70cd0" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.421842 4742 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0631e65b-dd02-40a7-8d35-2e4c66b70cd0" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.421807 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fcbfeb4-d9c8-4004-b86e-191d6c8e12c6" containerName="installer" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.422532 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-drnx8"] Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.422704 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.424568 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.425810 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.426313 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.426455 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.428161 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.428285 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.428293 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.428444 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.428822 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.429182 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.432053 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.432810 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.433189 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.443324 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.453022 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.453236 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.453219421 podStartE2EDuration="14.453219421s" podCreationTimestamp="2026-03-17 11:17:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:17:46.449567423 +0000 UTC m=+369.575695181" watchObservedRunningTime="2026-03-17 11:17:46.453219421 +0000 UTC m=+369.579347179" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 
11:17:46.456765 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.542219 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-user-template-login\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.542625 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-service-ca\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.542830 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.543110 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-user-template-error\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.543297 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdc92\" (UniqueName: \"kubernetes.io/projected/fbce34aa-8ab7-47cd-adf0-56b00e185629-kube-api-access-vdc92\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.543476 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fbce34aa-8ab7-47cd-adf0-56b00e185629-audit-policies\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.543735 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.543973 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.544205 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-session\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.544407 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-serving-cert\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.544592 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.544794 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-router-certs\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.545034 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.545251 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fbce34aa-8ab7-47cd-adf0-56b00e185629-audit-dir\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.646848 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-router-certs\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.647104 4742 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.647209 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fbce34aa-8ab7-47cd-adf0-56b00e185629-audit-dir\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.647294 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-user-template-login\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.647361 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-service-ca\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.647378 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fbce34aa-8ab7-47cd-adf0-56b00e185629-audit-dir\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.647421 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.647488 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-user-template-error\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.647542 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdc92\" (UniqueName: \"kubernetes.io/projected/fbce34aa-8ab7-47cd-adf0-56b00e185629-kube-api-access-vdc92\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.647596 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/fbce34aa-8ab7-47cd-adf0-56b00e185629-audit-policies\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.647692 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.647761 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.647858 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-session\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.647975 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-serving-cert\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.648018 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.648557 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-cliconfig\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.648567 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-service-ca\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.650988 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.652786 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fbce34aa-8ab7-47cd-adf0-56b00e185629-audit-policies\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.654519 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-router-certs\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.654615 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.655386 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-user-template-login\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.655805 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.663726 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.667153 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-session\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.669508 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-system-serving-cert\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.670428 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fbce34aa-8ab7-47cd-adf0-56b00e185629-v4-0-config-user-template-error\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.679065 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdc92\" (UniqueName: \"kubernetes.io/projected/fbce34aa-8ab7-47cd-adf0-56b00e185629-kube-api-access-vdc92\") pod \"oauth-openshift-fc8b9c7b8-tsqxb\" (UID: \"fbce34aa-8ab7-47cd-adf0-56b00e185629\") " pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.758414 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.885473 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 17 11:17:46 crc kubenswrapper[4742]: I0317 11:17:46.903887 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 17 11:17:47 crc kubenswrapper[4742]: I0317 11:17:47.002434 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 17 11:17:47 crc kubenswrapper[4742]: I0317 11:17:47.077048 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 17 11:17:47 crc kubenswrapper[4742]: I0317 11:17:47.105300 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 17 11:17:47 crc kubenswrapper[4742]: I0317 11:17:47.177756 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 17 11:17:47 crc kubenswrapper[4742]: I0317 11:17:47.328577 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 17 11:17:47 crc kubenswrapper[4742]: I0317 11:17:47.407094 4742 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 17 11:17:47 crc kubenswrapper[4742]: I0317 11:17:47.447645 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 17 11:17:47 crc kubenswrapper[4742]: I0317 11:17:47.649724 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 17 11:17:47 crc kubenswrapper[4742]: I0317 11:17:47.724852 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 17 11:17:47 crc kubenswrapper[4742]: I0317 11:17:47.758501 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 17 11:17:47 crc 
kubenswrapper[4742]: I0317 11:17:47.826631 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 17 11:17:47 crc kubenswrapper[4742]: I0317 11:17:47.936701 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 17 11:17:47 crc kubenswrapper[4742]: I0317 11:17:47.980465 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 17 11:17:48 crc kubenswrapper[4742]: I0317 11:17:48.129785 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 17 11:17:48 crc kubenswrapper[4742]: I0317 11:17:48.223144 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 17 11:17:48 crc kubenswrapper[4742]: I0317 11:17:48.259845 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 17 11:17:48 crc kubenswrapper[4742]: I0317 11:17:48.283186 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 17 11:17:48 crc kubenswrapper[4742]: I0317 11:17:48.345815 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 17 11:17:48 crc kubenswrapper[4742]: I0317 11:17:48.385113 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 17 11:17:48 crc kubenswrapper[4742]: I0317 11:17:48.414215 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 17 11:17:48 crc kubenswrapper[4742]: I0317 11:17:48.526847 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 17 11:17:48 crc kubenswrapper[4742]: I0317 11:17:48.570077 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 17 11:17:48 crc kubenswrapper[4742]: I0317 11:17:48.661258 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 17 11:17:48 crc kubenswrapper[4742]: I0317 11:17:48.866193 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 17 11:17:48 crc kubenswrapper[4742]: I0317 11:17:48.902078 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 17 11:17:48 crc kubenswrapper[4742]: I0317 11:17:48.910520 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 17 11:17:48 crc kubenswrapper[4742]: I0317 11:17:48.911882 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 17 11:17:48 crc kubenswrapper[4742]: I0317 11:17:48.972697 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 17 11:17:48 crc kubenswrapper[4742]: I0317 11:17:48.997364 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 17 11:17:49 crc kubenswrapper[4742]: I0317 11:17:49.045539 4742 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 17 11:17:49 crc kubenswrapper[4742]: I0317 11:17:49.153348 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 17 11:17:49 crc kubenswrapper[4742]: I0317 11:17:49.156096 4742 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 17 11:17:49 crc kubenswrapper[4742]: I0317 11:17:49.254203 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 17 11:17:49 crc kubenswrapper[4742]: I0317 11:17:49.365443 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 17 11:17:49 crc kubenswrapper[4742]: I0317 11:17:49.383051 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 17 11:17:49 crc kubenswrapper[4742]: I0317 11:17:49.412242 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 17 11:17:49 crc kubenswrapper[4742]: I0317 11:17:49.499264 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 17 11:17:49 crc kubenswrapper[4742]: I0317 11:17:49.592474 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 17 11:17:49 crc kubenswrapper[4742]: I0317 11:17:49.640837 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 17 11:17:49 crc kubenswrapper[4742]: I0317 11:17:49.648505 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 17 11:17:49 crc kubenswrapper[4742]: I0317 11:17:49.692944 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 17 11:17:49 crc kubenswrapper[4742]: I0317 11:17:49.831860 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 17 11:17:49 crc kubenswrapper[4742]: I0317 11:17:49.842095 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 17 11:17:49 crc kubenswrapper[4742]: I0317 11:17:49.945222 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 17 11:17:49 crc kubenswrapper[4742]: I0317 11:17:49.955241 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 17 11:17:49 crc kubenswrapper[4742]: I0317 11:17:49.965420 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.075930 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.099303 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.151781 4742 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-client" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.192869 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.230874 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.396949 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.472488 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.552781 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.577455 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.582287 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.589272 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.606518 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.620273 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.632178 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.679474 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.688854 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.823062 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.856359 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 17 11:17:50 crc kubenswrapper[4742]: I0317 11:17:50.874023 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.009597 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.014628 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 
11:17:51.064252 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.080676 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.119464 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.148122 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.156541 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.167112 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.214093 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.219460 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.257654 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.288210 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.328152 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.380364 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.396971 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.407327 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.488661 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.546373 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.598267 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.606253 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.662377 4742 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-stats-default" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.808010 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.812095 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.879640 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 17 11:17:51 crc kubenswrapper[4742]: I0317 11:17:51.963755 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 17 11:17:52 crc kubenswrapper[4742]: I0317 11:17:52.101595 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 17 11:17:52 crc kubenswrapper[4742]: I0317 11:17:52.147320 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 17 11:17:52 crc kubenswrapper[4742]: I0317 11:17:52.206138 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 17 11:17:52 crc kubenswrapper[4742]: I0317 11:17:52.236248 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 17 11:17:52 crc kubenswrapper[4742]: I0317 11:17:52.257698 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 17 11:17:52 crc kubenswrapper[4742]: I0317 11:17:52.270000 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 17 11:17:52 crc kubenswrapper[4742]: I0317 11:17:52.270289 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 17 11:17:52 crc kubenswrapper[4742]: I0317 11:17:52.293438 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 17 11:17:52 crc kubenswrapper[4742]: I0317 11:17:52.346558 4742 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 17 11:17:52 crc kubenswrapper[4742]: I0317 11:17:52.375680 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 17 11:17:52 crc kubenswrapper[4742]: I0317 11:17:52.453968 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 17 11:17:52 crc kubenswrapper[4742]: I0317 11:17:52.510201 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 17 11:17:52 crc kubenswrapper[4742]: I0317 11:17:52.527413 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 17 11:17:52 crc kubenswrapper[4742]: I0317 11:17:52.598279 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 17 11:17:52 crc kubenswrapper[4742]: I0317 11:17:52.809781 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 17 11:17:52 
crc kubenswrapper[4742]: I0317 11:17:52.981315 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.042582 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.082419 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.143660 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.182391 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.191737 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.237634 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.428541 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.429630 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.466666 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.481752 4742 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.504011 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.635787 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.645069 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.662564 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.698944 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.718143 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.851250 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.912460 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 17 11:17:53 crc kubenswrapper[4742]: I0317 11:17:53.928739 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.021891 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.097796 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.105399 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.277269 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.282004 4742 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.282418 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://73b9d948bfb791cffd4b87224d182f5a6de0f0055887c2f713ff8bd36b6e9e9d" gracePeriod=5 Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.367820 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.467407 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.469683 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.507004 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.541792 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.560503 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.571749 4742 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.645858 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.651775 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.716253 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.717346 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.754570 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.759350 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.767863 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.887404 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 17 11:17:54 crc kubenswrapper[4742]: I0317 11:17:54.933321 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 17 11:17:55 crc kubenswrapper[4742]: I0317 11:17:55.016223 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 17 11:17:55 crc kubenswrapper[4742]: I0317 11:17:55.070718 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 17 11:17:55 crc kubenswrapper[4742]: I0317 11:17:55.104297 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 17 11:17:55 crc kubenswrapper[4742]: I0317 11:17:55.116124 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 17 11:17:55 crc kubenswrapper[4742]: I0317 11:17:55.155398 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 17 11:17:55 crc kubenswrapper[4742]: I0317 11:17:55.204141 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 17 11:17:55 crc kubenswrapper[4742]: I0317 11:17:55.405118 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 17 11:17:55 crc kubenswrapper[4742]: I0317 11:17:55.412859 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 17 11:17:55 crc kubenswrapper[4742]: I0317 11:17:55.591670 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 17 11:17:55 crc kubenswrapper[4742]: I0317 11:17:55.608090 4742 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 17 11:17:55 crc kubenswrapper[4742]: I0317 11:17:55.753040 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 17 11:17:55 crc kubenswrapper[4742]: I0317 11:17:55.783735 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 17 11:17:55 crc kubenswrapper[4742]: I0317 11:17:55.868661 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 17 11:17:55 crc kubenswrapper[4742]: I0317 11:17:55.985525 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 17 11:17:56 crc kubenswrapper[4742]: I0317 11:17:56.024489 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 17 11:17:56 crc kubenswrapper[4742]: I0317 11:17:56.036018 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 17 11:17:56 crc kubenswrapper[4742]: I0317 11:17:56.066808 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 17 11:17:56 crc kubenswrapper[4742]: I0317 11:17:56.168390 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 17 11:17:56 crc kubenswrapper[4742]: I0317 11:17:56.505220 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 17 11:17:56 crc kubenswrapper[4742]: I0317 11:17:56.577829 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 17 11:17:56 crc kubenswrapper[4742]: I0317 11:17:56.581020 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 17 11:17:56 crc kubenswrapper[4742]: I0317 11:17:56.620003 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 17 11:17:56 crc kubenswrapper[4742]: I0317 11:17:56.662673 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:17:56 crc kubenswrapper[4742]: I0317 11:17:56.662746 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:17:56 crc kubenswrapper[4742]: I0317 11:17:56.695052 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 17 11:17:56 crc kubenswrapper[4742]: I0317 11:17:56.777802 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 17 11:17:56 crc kubenswrapper[4742]: I0317 11:17:56.779800 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 17 11:17:56 crc kubenswrapper[4742]: I0317 11:17:56.805030 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 17 11:17:56 crc kubenswrapper[4742]: I0317 11:17:56.857615 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 17 11:17:56 crc kubenswrapper[4742]: I0317 11:17:56.888045 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 17 11:17:57 crc kubenswrapper[4742]: I0317 11:17:57.364483 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 17 11:17:57 crc kubenswrapper[4742]: I0317 11:17:57.384896 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 17 11:17:57 crc kubenswrapper[4742]: I0317 11:17:57.399599 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 17 11:17:57 crc kubenswrapper[4742]: I0317 11:17:57.415178 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 17 11:17:57 crc kubenswrapper[4742]: I0317 11:17:57.446741 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 17 11:17:57 crc kubenswrapper[4742]: I0317 11:17:57.451280 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 17 11:17:57 crc kubenswrapper[4742]: I0317 11:17:57.504540 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 17 11:17:57 crc kubenswrapper[4742]: I0317 11:17:57.537896 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 17 11:17:57 crc kubenswrapper[4742]: I0317 11:17:57.618406 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 17 11:17:57 crc kubenswrapper[4742]: I0317 11:17:57.626781 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 17 11:17:57 crc kubenswrapper[4742]: I0317 11:17:57.732792 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 17 11:17:57 crc kubenswrapper[4742]: I0317 11:17:57.841492 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 17 11:17:57 crc kubenswrapper[4742]: I0317 11:17:57.930284 
4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 17 11:17:58 crc kubenswrapper[4742]: I0317 11:17:58.094358 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 17 11:17:58 crc kubenswrapper[4742]: I0317 11:17:58.124434 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 17 11:17:58 crc kubenswrapper[4742]: I0317 11:17:58.210781 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 17 11:17:58 crc kubenswrapper[4742]: I0317 11:17:58.533378 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 17 11:17:58 crc kubenswrapper[4742]: I0317 11:17:58.669919 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 17 11:17:58 crc kubenswrapper[4742]: I0317 11:17:58.766149 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 17 11:17:58 crc kubenswrapper[4742]: I0317 11:17:58.834436 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 17 11:17:58 crc kubenswrapper[4742]: I0317 11:17:58.883325 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 17 11:17:58 crc kubenswrapper[4742]: I0317 11:17:58.965495 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 17 11:17:59 crc kubenswrapper[4742]: I0317 11:17:59.220064 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 17 11:17:59 crc kubenswrapper[4742]: I0317 11:17:59.269580 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 17 11:17:59 crc kubenswrapper[4742]: I0317 11:17:59.405864 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 17 11:17:59 crc kubenswrapper[4742]: I0317 11:17:59.458428 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 17 11:17:59 crc kubenswrapper[4742]: I0317 11:17:59.493308 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 17 11:17:59 crc kubenswrapper[4742]: I0317 11:17:59.505370 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 17 11:17:59 crc kubenswrapper[4742]: I0317 11:17:59.551719 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 17 11:17:59 crc kubenswrapper[4742]: I0317 11:17:59.551784 4742 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerID="73b9d948bfb791cffd4b87224d182f5a6de0f0055887c2f713ff8bd36b6e9e9d" exitCode=137 Mar 17 11:17:59 crc kubenswrapper[4742]: I0317 11:17:59.566220 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 17 11:17:59 crc kubenswrapper[4742]: I0317 11:17:59.575415 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 17 11:17:59 crc kubenswrapper[4742]: I0317 11:17:59.667767 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 17 11:17:59 crc kubenswrapper[4742]: I0317 11:17:59.875620 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 17 11:17:59 crc kubenswrapper[4742]: I0317 11:17:59.875678 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.037355 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.037416 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.037436 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.037473 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.037488 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.037482 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.037507 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.037565 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.037585 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.037774 4742 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.037788 4742 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.037796 4742 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.037804 4742 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.045945 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.049979 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.139503 4742 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.483612 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.559772 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.559892 4742 scope.go:117] "RemoveContainer" containerID="73b9d948bfb791cffd4b87224d182f5a6de0f0055887c2f713ff8bd36b6e9e9d" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.560149 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.591159 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562438-9522d"] Mar 17 11:18:00 crc kubenswrapper[4742]: E0317 11:18:00.591398 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.591421 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.591539 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.591963 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562438-9522d" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.593516 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.594021 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.595001 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.617780 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.686965 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.687393 4742 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.689125 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.701655 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.701705 4742 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="b3c9747e-89ad-4860-a2c7-024e2f099169" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.707496 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.707546 4742 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="b3c9747e-89ad-4860-a2c7-024e2f099169" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.747585 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x8j4\" (UniqueName: \"kubernetes.io/projected/e08745d1-ba86-4a70-a6f0-c8108edb08b7-kube-api-access-6x8j4\") pod \"auto-csr-approver-29562438-9522d\" (UID: \"e08745d1-ba86-4a70-a6f0-c8108edb08b7\") " pod="openshift-infra/auto-csr-approver-29562438-9522d" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.849241 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x8j4\" (UniqueName: \"kubernetes.io/projected/e08745d1-ba86-4a70-a6f0-c8108edb08b7-kube-api-access-6x8j4\") pod \"auto-csr-approver-29562438-9522d\" (UID: \"e08745d1-ba86-4a70-a6f0-c8108edb08b7\") " pod="openshift-infra/auto-csr-approver-29562438-9522d" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.880076 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562438-9522d"] Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.883106 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x8j4\" (UniqueName: 
\"kubernetes.io/projected/e08745d1-ba86-4a70-a6f0-c8108edb08b7-kube-api-access-6x8j4\") pod \"auto-csr-approver-29562438-9522d\" (UID: \"e08745d1-ba86-4a70-a6f0-c8108edb08b7\") " pod="openshift-infra/auto-csr-approver-29562438-9522d" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.901564 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb"] Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.918979 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562438-9522d" Mar 17 11:18:00 crc kubenswrapper[4742]: I0317 11:18:00.935284 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 17 11:18:01 crc kubenswrapper[4742]: I0317 11:18:01.348116 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb"] Mar 17 11:18:01 crc kubenswrapper[4742]: I0317 11:18:01.390344 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562438-9522d"] Mar 17 11:18:01 crc kubenswrapper[4742]: W0317 11:18:01.421218 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode08745d1_ba86_4a70_a6f0_c8108edb08b7.slice/crio-88a269b3beb64c45fa5dbb5b1e71a9ed1228439731fc04f287a5c17338fbcd8f WatchSource:0}: Error finding container 88a269b3beb64c45fa5dbb5b1e71a9ed1228439731fc04f287a5c17338fbcd8f: Status 404 returned error can't find the container with id 88a269b3beb64c45fa5dbb5b1e71a9ed1228439731fc04f287a5c17338fbcd8f Mar 17 11:18:01 crc kubenswrapper[4742]: I0317 11:18:01.576786 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" event={"ID":"fbce34aa-8ab7-47cd-adf0-56b00e185629","Type":"ContainerStarted","Data":"46628a3c4eecd062bc7ee26a91cea61a32451d6a524125cc390f2b9f3275ce71"} Mar 17 11:18:01 crc kubenswrapper[4742]: I0317 11:18:01.579025 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562438-9522d" event={"ID":"e08745d1-ba86-4a70-a6f0-c8108edb08b7","Type":"ContainerStarted","Data":"88a269b3beb64c45fa5dbb5b1e71a9ed1228439731fc04f287a5c17338fbcd8f"} Mar 17 11:18:02 crc kubenswrapper[4742]: I0317 11:18:02.589014 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" event={"ID":"fbce34aa-8ab7-47cd-adf0-56b00e185629","Type":"ContainerStarted","Data":"d2418890e7d19eaf9e305be230e881755cea3281602fe280f5ba470aed4d1bdf"} Mar 17 11:18:02 crc kubenswrapper[4742]: I0317 11:18:02.589685 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:18:02 crc kubenswrapper[4742]: I0317 11:18:02.598162 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" Mar 17 11:18:02 crc kubenswrapper[4742]: I0317 11:18:02.627745 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-fc8b9c7b8-tsqxb" podStartSLOduration=78.627714568 podStartE2EDuration="1m18.627714568s" podCreationTimestamp="2026-03-17 11:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:18:02.621731653 
+0000 UTC m=+385.747859471" watchObservedRunningTime="2026-03-17 11:18:02.627714568 +0000 UTC m=+385.753842356" Mar 17 11:18:03 crc kubenswrapper[4742]: I0317 11:18:03.597333 4742 generic.go:334] "Generic (PLEG): container finished" podID="e08745d1-ba86-4a70-a6f0-c8108edb08b7" containerID="608de52b8994835e4794de9795c9480d78922f9e81dfca4b2b4d7d4393551adc" exitCode=0 Mar 17 11:18:03 crc kubenswrapper[4742]: I0317 11:18:03.597423 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562438-9522d" event={"ID":"e08745d1-ba86-4a70-a6f0-c8108edb08b7","Type":"ContainerDied","Data":"608de52b8994835e4794de9795c9480d78922f9e81dfca4b2b4d7d4393551adc"} Mar 17 11:18:04 crc kubenswrapper[4742]: I0317 11:18:04.971540 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562438-9522d" Mar 17 11:18:05 crc kubenswrapper[4742]: I0317 11:18:05.107526 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x8j4\" (UniqueName: \"kubernetes.io/projected/e08745d1-ba86-4a70-a6f0-c8108edb08b7-kube-api-access-6x8j4\") pod \"e08745d1-ba86-4a70-a6f0-c8108edb08b7\" (UID: \"e08745d1-ba86-4a70-a6f0-c8108edb08b7\") " Mar 17 11:18:05 crc kubenswrapper[4742]: I0317 11:18:05.112089 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e08745d1-ba86-4a70-a6f0-c8108edb08b7-kube-api-access-6x8j4" (OuterVolumeSpecName: "kube-api-access-6x8j4") pod "e08745d1-ba86-4a70-a6f0-c8108edb08b7" (UID: "e08745d1-ba86-4a70-a6f0-c8108edb08b7"). InnerVolumeSpecName "kube-api-access-6x8j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:18:05 crc kubenswrapper[4742]: I0317 11:18:05.209290 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x8j4\" (UniqueName: \"kubernetes.io/projected/e08745d1-ba86-4a70-a6f0-c8108edb08b7-kube-api-access-6x8j4\") on node \"crc\" DevicePath \"\"" Mar 17 11:18:05 crc kubenswrapper[4742]: I0317 11:18:05.615544 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562438-9522d" event={"ID":"e08745d1-ba86-4a70-a6f0-c8108edb08b7","Type":"ContainerDied","Data":"88a269b3beb64c45fa5dbb5b1e71a9ed1228439731fc04f287a5c17338fbcd8f"} Mar 17 11:18:05 crc kubenswrapper[4742]: I0317 11:18:05.615603 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88a269b3beb64c45fa5dbb5b1e71a9ed1228439731fc04f287a5c17338fbcd8f" Mar 17 11:18:05 crc kubenswrapper[4742]: I0317 11:18:05.615679 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562438-9522d" Mar 17 11:18:19 crc kubenswrapper[4742]: I0317 11:18:19.707347 4742 generic.go:334] "Generic (PLEG): container finished" podID="f33a63f1-688a-46eb-a32f-5259fa969528" containerID="6bd100bf1cb3a9dd55bab92415f41ddfff93a7f78e8b9d3b6325876762e27138" exitCode=0 Mar 17 11:18:19 crc kubenswrapper[4742]: I0317 11:18:19.707479 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" event={"ID":"f33a63f1-688a-46eb-a32f-5259fa969528","Type":"ContainerDied","Data":"6bd100bf1cb3a9dd55bab92415f41ddfff93a7f78e8b9d3b6325876762e27138"} Mar 17 11:18:19 crc kubenswrapper[4742]: I0317 11:18:19.708568 4742 scope.go:117] "RemoveContainer" containerID="6bd100bf1cb3a9dd55bab92415f41ddfff93a7f78e8b9d3b6325876762e27138" Mar 17 11:18:20 crc kubenswrapper[4742]: I0317 11:18:20.717606 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" event={"ID":"f33a63f1-688a-46eb-a32f-5259fa969528","Type":"ContainerStarted","Data":"07209bafc5652475336d6433879d25838a57ab299ba848584623007f65a38619"} Mar 17 11:18:20 crc kubenswrapper[4742]: I0317 11:18:20.718865 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" Mar 17 11:18:20 crc kubenswrapper[4742]: I0317 11:18:20.721341 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" Mar 17 11:18:48 crc kubenswrapper[4742]: I0317 11:18:48.043818 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:18:48 crc kubenswrapper[4742]: I0317 11:18:48.044471 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:18:57 crc kubenswrapper[4742]: I0317 11:18:57.858159 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-trcpk"] Mar 17 11:18:57 crc kubenswrapper[4742]: E0317 11:18:57.858892 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e08745d1-ba86-4a70-a6f0-c8108edb08b7" containerName="oc" Mar 17 11:18:57 crc kubenswrapper[4742]: I0317 11:18:57.858935 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08745d1-ba86-4a70-a6f0-c8108edb08b7" containerName="oc" Mar 17 11:18:57 crc kubenswrapper[4742]: I0317 11:18:57.859070 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="e08745d1-ba86-4a70-a6f0-c8108edb08b7" containerName="oc" Mar 17 11:18:57 crc kubenswrapper[4742]: I0317 11:18:57.859655 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:57 crc kubenswrapper[4742]: I0317 11:18:57.886358 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-trcpk"] Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.039872 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-registry-tls\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.039930 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-bound-sa-token\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.039982 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.040015 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.040149 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-registry-certificates\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.040216 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-trusted-ca\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.040350 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4bt5\" (UniqueName: \"kubernetes.io/projected/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-kube-api-access-w4bt5\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.040407 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.062862 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.142655 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.142773 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-registry-certificates\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.142815 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-trusted-ca\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.142971 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4bt5\" (UniqueName: \"kubernetes.io/projected/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-kube-api-access-w4bt5\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.143050 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.143093 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-registry-tls\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.143148 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-bound-sa-token\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.143747 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.144119 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-registry-certificates\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.144510 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-trusted-ca\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.148953 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.149866 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-registry-tls\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.169275 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-bound-sa-token\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.171849 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4bt5\" (UniqueName: \"kubernetes.io/projected/7c65cb36-35d9-4ae5-a9a5-2a3c0383151b-kube-api-access-w4bt5\") pod \"image-registry-66df7c8f76-trcpk\" (UID: \"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b\") " pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.180942 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.610592 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-trcpk"] Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.960746 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" event={"ID":"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b","Type":"ContainerStarted","Data":"f4d71cc63def344c80185f946bb164f9397693fd059174c3d95d214e45681870"} Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.960795 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" event={"ID":"7c65cb36-35d9-4ae5-a9a5-2a3c0383151b","Type":"ContainerStarted","Data":"a5091f6126a2e8cd7b6a0eba7cf8e13d8a1a02e8021f97f97256d42d0410a5b8"} Mar 17 11:18:58 crc kubenswrapper[4742]: I0317 11:18:58.962366 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:18:59 crc kubenswrapper[4742]: I0317 11:18:59.060139 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" podStartSLOduration=2.060123612 podStartE2EDuration="2.060123612s" podCreationTimestamp="2026-03-17 11:18:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:18:59.057056184 +0000 UTC m=+442.183183982" watchObservedRunningTime="2026-03-17 11:18:59.060123612 +0000 UTC m=+442.186251370" Mar 17 11:19:18 crc kubenswrapper[4742]: I0317 11:19:18.044006 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:19:18 crc kubenswrapper[4742]: I0317 11:19:18.044729 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:19:18 crc kubenswrapper[4742]: I0317 11:19:18.192537 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-trcpk" Mar 17 11:19:18 crc kubenswrapper[4742]: I0317 11:19:18.254685 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9lz9n"] Mar 17 11:19:30 crc kubenswrapper[4742]: I0317 11:19:30.685185 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:19:30 crc kubenswrapper[4742]: I0317 11:19:30.685872 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:19:30 crc kubenswrapper[4742]: I0317 11:19:30.687863 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:19:30 crc kubenswrapper[4742]: I0317 11:19:30.698876 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:19:30 crc kubenswrapper[4742]: I0317 11:19:30.788675 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:19:30 crc kubenswrapper[4742]: I0317 11:19:30.788761 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:19:30 crc kubenswrapper[4742]: I0317 11:19:30.793713 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:19:30 crc kubenswrapper[4742]: I0317 11:19:30.796400 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:19:30 crc kubenswrapper[4742]: I0317 11:19:30.863704 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 17 11:19:30 crc kubenswrapper[4742]: I0317 11:19:30.863846 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:19:30 crc kubenswrapper[4742]: I0317 11:19:30.863704 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 17 11:19:31 crc kubenswrapper[4742]: W0317 11:19:31.357174 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-4612b9b875b103559bcb8a2dcb53ddb38bb8af1d6f8610eec37a38d9056f1b52 WatchSource:0}: Error finding container 4612b9b875b103559bcb8a2dcb53ddb38bb8af1d6f8610eec37a38d9056f1b52: Status 404 returned error can't find the container with id 4612b9b875b103559bcb8a2dcb53ddb38bb8af1d6f8610eec37a38d9056f1b52 Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.438215 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0d10b785ce9a489c37c04259f0a1e13508e2652e650a9220df6075373faee780"} Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.451890 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4612b9b875b103559bcb8a2dcb53ddb38bb8af1d6f8610eec37a38d9056f1b52"} Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.453175 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c997d1ec36e86e4fa051fe43916914ca7374aed087689a687b00cdd8fbac930f"} Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.608614 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cnzgk"] Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.609242 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cnzgk" podUID="4b80c435-0e24-4ab2-980c-f2dfb1baef87" containerName="registry-server" containerID="cri-o://69bfcbe9d9a000afa411d70603f541dc847747a1f3a53f3401ad9723ce234031" gracePeriod=30 Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.620988 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5v4hw"] Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.621344 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5v4hw" podUID="72e6f877-4431-46ba-8c22-0479a383851b" containerName="registry-server" containerID="cri-o://bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963" gracePeriod=30 Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.624389 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6n4cr"] Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.624594 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" podUID="f33a63f1-688a-46eb-a32f-5259fa969528" containerName="marketplace-operator" containerID="cri-o://07209bafc5652475336d6433879d25838a57ab299ba848584623007f65a38619" gracePeriod=30 Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.634273 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rxctp"] Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 
11:19:31.634944 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rxctp" Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.642851 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p27vh"] Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.645586 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p27vh" podUID="ce3a51df-d6e4-46d5-95e3-8be6aaba196f" containerName="registry-server" containerID="cri-o://17a2045c0843315f80387afe38ad29abf88b8c06659c4014ee2a76740605e7c6" gracePeriod=30 Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.647649 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rxctp"] Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.660966 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-47pqd"] Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.661179 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-47pqd" podUID="24946b1f-6d3e-457c-b78f-213f94b2b650" containerName="registry-server" containerID="cri-o://00d7d9ffa3c91c1b26a7a94c4b95be9c369685a79036814e469d8c0dec03d7e0" gracePeriod=30 Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.802702 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn7t5\" (UniqueName: \"kubernetes.io/projected/66e4c4dd-b0fe-4877-8520-bdbd18b096d4-kube-api-access-jn7t5\") pod \"marketplace-operator-79b997595-rxctp\" (UID: \"66e4c4dd-b0fe-4877-8520-bdbd18b096d4\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxctp" Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.802812 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66e4c4dd-b0fe-4877-8520-bdbd18b096d4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rxctp\" (UID: \"66e4c4dd-b0fe-4877-8520-bdbd18b096d4\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxctp" Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.802841 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/66e4c4dd-b0fe-4877-8520-bdbd18b096d4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rxctp\" (UID: \"66e4c4dd-b0fe-4877-8520-bdbd18b096d4\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxctp" Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.904520 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn7t5\" (UniqueName: \"kubernetes.io/projected/66e4c4dd-b0fe-4877-8520-bdbd18b096d4-kube-api-access-jn7t5\") pod \"marketplace-operator-79b997595-rxctp\" (UID: \"66e4c4dd-b0fe-4877-8520-bdbd18b096d4\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxctp" Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.904936 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66e4c4dd-b0fe-4877-8520-bdbd18b096d4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rxctp\" (UID: 
\"66e4c4dd-b0fe-4877-8520-bdbd18b096d4\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxctp" Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.904966 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/66e4c4dd-b0fe-4877-8520-bdbd18b096d4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rxctp\" (UID: \"66e4c4dd-b0fe-4877-8520-bdbd18b096d4\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxctp" Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.906329 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/66e4c4dd-b0fe-4877-8520-bdbd18b096d4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rxctp\" (UID: \"66e4c4dd-b0fe-4877-8520-bdbd18b096d4\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxctp" Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.910106 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/66e4c4dd-b0fe-4877-8520-bdbd18b096d4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rxctp\" (UID: \"66e4c4dd-b0fe-4877-8520-bdbd18b096d4\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxctp" Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.920429 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn7t5\" (UniqueName: \"kubernetes.io/projected/66e4c4dd-b0fe-4877-8520-bdbd18b096d4-kube-api-access-jn7t5\") pod \"marketplace-operator-79b997595-rxctp\" (UID: \"66e4c4dd-b0fe-4877-8520-bdbd18b096d4\") " pod="openshift-marketplace/marketplace-operator-79b997595-rxctp" Mar 17 11:19:31 crc kubenswrapper[4742]: I0317 11:19:31.953738 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rxctp" Mar 17 11:19:31 crc kubenswrapper[4742]: E0317 11:19:31.958470 4742 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963 is running failed: container process not found" containerID="bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963" cmd=["grpc_health_probe","-addr=:50051"] Mar 17 11:19:31 crc kubenswrapper[4742]: E0317 11:19:31.959024 4742 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963 is running failed: container process not found" containerID="bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963" cmd=["grpc_health_probe","-addr=:50051"] Mar 17 11:19:31 crc kubenswrapper[4742]: E0317 11:19:31.959393 4742 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963 is running failed: container process not found" containerID="bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963" cmd=["grpc_health_probe","-addr=:50051"] Mar 17 11:19:31 crc kubenswrapper[4742]: E0317 11:19:31.959424 4742 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-5v4hw" podUID="72e6f877-4431-46ba-8c22-0479a383851b" containerName="registry-server" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.031650 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cnzgk" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.082818 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5v4hw" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.092349 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p27vh" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.094300 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-47pqd" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.125073 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.213831 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-utilities\") pod \"ce3a51df-d6e4-46d5-95e3-8be6aaba196f\" (UID: \"ce3a51df-d6e4-46d5-95e3-8be6aaba196f\") " Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.213872 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-catalog-content\") pod \"ce3a51df-d6e4-46d5-95e3-8be6aaba196f\" (UID: \"ce3a51df-d6e4-46d5-95e3-8be6aaba196f\") " Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.213901 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkglr\" (UniqueName: \"kubernetes.io/projected/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-kube-api-access-tkglr\") pod \"ce3a51df-d6e4-46d5-95e3-8be6aaba196f\" (UID: \"ce3a51df-d6e4-46d5-95e3-8be6aaba196f\") " Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.213943 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w5nm\" (UniqueName: \"kubernetes.io/projected/4b80c435-0e24-4ab2-980c-f2dfb1baef87-kube-api-access-7w5nm\") pod \"4b80c435-0e24-4ab2-980c-f2dfb1baef87\" (UID: \"4b80c435-0e24-4ab2-980c-f2dfb1baef87\") " Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.213964 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e6f877-4431-46ba-8c22-0479a383851b-utilities\") pod \"72e6f877-4431-46ba-8c22-0479a383851b\" (UID: \"72e6f877-4431-46ba-8c22-0479a383851b\") " Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.213983 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24946b1f-6d3e-457c-b78f-213f94b2b650-utilities\") pod \"24946b1f-6d3e-457c-b78f-213f94b2b650\" (UID: \"24946b1f-6d3e-457c-b78f-213f94b2b650\") " Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.214017 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24946b1f-6d3e-457c-b78f-213f94b2b650-catalog-content\") pod \"24946b1f-6d3e-457c-b78f-213f94b2b650\" (UID: \"24946b1f-6d3e-457c-b78f-213f94b2b650\") " Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.214046 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbjf9\" (UniqueName: \"kubernetes.io/projected/24946b1f-6d3e-457c-b78f-213f94b2b650-kube-api-access-wbjf9\") pod \"24946b1f-6d3e-457c-b78f-213f94b2b650\" (UID: \"24946b1f-6d3e-457c-b78f-213f94b2b650\") " Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.214075 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e6f877-4431-46ba-8c22-0479a383851b-catalog-content\") pod \"72e6f877-4431-46ba-8c22-0479a383851b\" (UID: \"72e6f877-4431-46ba-8c22-0479a383851b\") " Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.214101 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5sxs\" (UniqueName: 
\"kubernetes.io/projected/72e6f877-4431-46ba-8c22-0479a383851b-kube-api-access-w5sxs\") pod \"72e6f877-4431-46ba-8c22-0479a383851b\" (UID: \"72e6f877-4431-46ba-8c22-0479a383851b\") " Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.214120 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b80c435-0e24-4ab2-980c-f2dfb1baef87-utilities\") pod \"4b80c435-0e24-4ab2-980c-f2dfb1baef87\" (UID: \"4b80c435-0e24-4ab2-980c-f2dfb1baef87\") " Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.214177 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b80c435-0e24-4ab2-980c-f2dfb1baef87-catalog-content\") pod \"4b80c435-0e24-4ab2-980c-f2dfb1baef87\" (UID: \"4b80c435-0e24-4ab2-980c-f2dfb1baef87\") " Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.214971 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24946b1f-6d3e-457c-b78f-213f94b2b650-utilities" (OuterVolumeSpecName: "utilities") pod "24946b1f-6d3e-457c-b78f-213f94b2b650" (UID: "24946b1f-6d3e-457c-b78f-213f94b2b650"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.215541 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-utilities" (OuterVolumeSpecName: "utilities") pod "ce3a51df-d6e4-46d5-95e3-8be6aaba196f" (UID: "ce3a51df-d6e4-46d5-95e3-8be6aaba196f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.223738 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b80c435-0e24-4ab2-980c-f2dfb1baef87-utilities" (OuterVolumeSpecName: "utilities") pod "4b80c435-0e24-4ab2-980c-f2dfb1baef87" (UID: "4b80c435-0e24-4ab2-980c-f2dfb1baef87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.224010 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72e6f877-4431-46ba-8c22-0479a383851b-utilities" (OuterVolumeSpecName: "utilities") pod "72e6f877-4431-46ba-8c22-0479a383851b" (UID: "72e6f877-4431-46ba-8c22-0479a383851b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.227487 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72e6f877-4431-46ba-8c22-0479a383851b-kube-api-access-w5sxs" (OuterVolumeSpecName: "kube-api-access-w5sxs") pod "72e6f877-4431-46ba-8c22-0479a383851b" (UID: "72e6f877-4431-46ba-8c22-0479a383851b"). InnerVolumeSpecName "kube-api-access-w5sxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.227657 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24946b1f-6d3e-457c-b78f-213f94b2b650-kube-api-access-wbjf9" (OuterVolumeSpecName: "kube-api-access-wbjf9") pod "24946b1f-6d3e-457c-b78f-213f94b2b650" (UID: "24946b1f-6d3e-457c-b78f-213f94b2b650"). InnerVolumeSpecName "kube-api-access-wbjf9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.227563 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-kube-api-access-tkglr" (OuterVolumeSpecName: "kube-api-access-tkglr") pod "ce3a51df-d6e4-46d5-95e3-8be6aaba196f" (UID: "ce3a51df-d6e4-46d5-95e3-8be6aaba196f"). InnerVolumeSpecName "kube-api-access-tkglr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.246532 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b80c435-0e24-4ab2-980c-f2dfb1baef87-kube-api-access-7w5nm" (OuterVolumeSpecName: "kube-api-access-7w5nm") pod "4b80c435-0e24-4ab2-980c-f2dfb1baef87" (UID: "4b80c435-0e24-4ab2-980c-f2dfb1baef87"). InnerVolumeSpecName "kube-api-access-7w5nm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.269896 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce3a51df-d6e4-46d5-95e3-8be6aaba196f" (UID: "ce3a51df-d6e4-46d5-95e3-8be6aaba196f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.285968 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b80c435-0e24-4ab2-980c-f2dfb1baef87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b80c435-0e24-4ab2-980c-f2dfb1baef87" (UID: "4b80c435-0e24-4ab2-980c-f2dfb1baef87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.295477 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72e6f877-4431-46ba-8c22-0479a383851b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72e6f877-4431-46ba-8c22-0479a383851b" (UID: "72e6f877-4431-46ba-8c22-0479a383851b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.314712 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f33a63f1-688a-46eb-a32f-5259fa969528-marketplace-trusted-ca\") pod \"f33a63f1-688a-46eb-a32f-5259fa969528\" (UID: \"f33a63f1-688a-46eb-a32f-5259fa969528\") " Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.314845 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f33a63f1-688a-46eb-a32f-5259fa969528-marketplace-operator-metrics\") pod \"f33a63f1-688a-46eb-a32f-5259fa969528\" (UID: \"f33a63f1-688a-46eb-a32f-5259fa969528\") " Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.314869 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7lcv\" (UniqueName: \"kubernetes.io/projected/f33a63f1-688a-46eb-a32f-5259fa969528-kube-api-access-v7lcv\") pod \"f33a63f1-688a-46eb-a32f-5259fa969528\" (UID: \"f33a63f1-688a-46eb-a32f-5259fa969528\") " Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.315079 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.315099 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.315110 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkglr\" (UniqueName: \"kubernetes.io/projected/ce3a51df-d6e4-46d5-95e3-8be6aaba196f-kube-api-access-tkglr\") on node \"crc\" DevicePath \"\"" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.315119 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w5nm\" (UniqueName: \"kubernetes.io/projected/4b80c435-0e24-4ab2-980c-f2dfb1baef87-kube-api-access-7w5nm\") on node \"crc\" DevicePath \"\"" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.315129 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e6f877-4431-46ba-8c22-0479a383851b-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.315136 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24946b1f-6d3e-457c-b78f-213f94b2b650-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.315144 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbjf9\" (UniqueName: \"kubernetes.io/projected/24946b1f-6d3e-457c-b78f-213f94b2b650-kube-api-access-wbjf9\") on node \"crc\" DevicePath \"\"" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.315152 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e6f877-4431-46ba-8c22-0479a383851b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.315160 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5sxs\" (UniqueName: 
\"kubernetes.io/projected/72e6f877-4431-46ba-8c22-0479a383851b-kube-api-access-w5sxs\") on node \"crc\" DevicePath \"\"" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.315168 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b80c435-0e24-4ab2-980c-f2dfb1baef87-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.315175 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b80c435-0e24-4ab2-980c-f2dfb1baef87-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.316054 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f33a63f1-688a-46eb-a32f-5259fa969528-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "f33a63f1-688a-46eb-a32f-5259fa969528" (UID: "f33a63f1-688a-46eb-a32f-5259fa969528"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.318672 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f33a63f1-688a-46eb-a32f-5259fa969528-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "f33a63f1-688a-46eb-a32f-5259fa969528" (UID: "f33a63f1-688a-46eb-a32f-5259fa969528"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.318773 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f33a63f1-688a-46eb-a32f-5259fa969528-kube-api-access-v7lcv" (OuterVolumeSpecName: "kube-api-access-v7lcv") pod "f33a63f1-688a-46eb-a32f-5259fa969528" (UID: "f33a63f1-688a-46eb-a32f-5259fa969528"). InnerVolumeSpecName "kube-api-access-v7lcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.396852 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rxctp"] Mar 17 11:19:32 crc kubenswrapper[4742]: W0317 11:19:32.398790 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66e4c4dd_b0fe_4877_8520_bdbd18b096d4.slice/crio-dd9743794d71b034433a45425163939e587ad29784fd07a045b990c0b06333af WatchSource:0}: Error finding container dd9743794d71b034433a45425163939e587ad29784fd07a045b990c0b06333af: Status 404 returned error can't find the container with id dd9743794d71b034433a45425163939e587ad29784fd07a045b990c0b06333af Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.401030 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24946b1f-6d3e-457c-b78f-213f94b2b650-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24946b1f-6d3e-457c-b78f-213f94b2b650" (UID: "24946b1f-6d3e-457c-b78f-213f94b2b650"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.416190 4742 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f33a63f1-688a-46eb-a32f-5259fa969528-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.416238 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7lcv\" (UniqueName: \"kubernetes.io/projected/f33a63f1-688a-46eb-a32f-5259fa969528-kube-api-access-v7lcv\") on node \"crc\" DevicePath \"\"" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.416255 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24946b1f-6d3e-457c-b78f-213f94b2b650-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.416266 4742 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f33a63f1-688a-46eb-a32f-5259fa969528-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.460833 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"593677bf0bd3f58afb5b694764db63298e2424873177cd9be13fa5d2bbfd42a9"} Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.464154 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"305e161b23ce28c26e6882876b4bb6e5cd0bfe47a718a210e64df282bddbf203"} Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.464894 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.465880 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ed1b6a54da54db5c1a34dd8949a9cdad4cc147d2b75133866ca5e08a39c6a9cc"} Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.468060 4742 generic.go:334] "Generic (PLEG): container finished" podID="4b80c435-0e24-4ab2-980c-f2dfb1baef87" containerID="69bfcbe9d9a000afa411d70603f541dc847747a1f3a53f3401ad9723ce234031" exitCode=0 Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.468099 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cnzgk" event={"ID":"4b80c435-0e24-4ab2-980c-f2dfb1baef87","Type":"ContainerDied","Data":"69bfcbe9d9a000afa411d70603f541dc847747a1f3a53f3401ad9723ce234031"} Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.468117 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cnzgk" event={"ID":"4b80c435-0e24-4ab2-980c-f2dfb1baef87","Type":"ContainerDied","Data":"67fa5be08079fe2895da8132f995273c1c25e0ee686c429150d425a61411791f"} Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.468133 4742 scope.go:117] "RemoveContainer" containerID="69bfcbe9d9a000afa411d70603f541dc847747a1f3a53f3401ad9723ce234031" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.468309 4742 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cnzgk" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.473112 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rxctp" event={"ID":"66e4c4dd-b0fe-4877-8520-bdbd18b096d4","Type":"ContainerStarted","Data":"dd9743794d71b034433a45425163939e587ad29784fd07a045b990c0b06333af"} Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.475994 4742 generic.go:334] "Generic (PLEG): container finished" podID="24946b1f-6d3e-457c-b78f-213f94b2b650" containerID="00d7d9ffa3c91c1b26a7a94c4b95be9c369685a79036814e469d8c0dec03d7e0" exitCode=0 Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.476056 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47pqd" event={"ID":"24946b1f-6d3e-457c-b78f-213f94b2b650","Type":"ContainerDied","Data":"00d7d9ffa3c91c1b26a7a94c4b95be9c369685a79036814e469d8c0dec03d7e0"} Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.476074 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-47pqd" event={"ID":"24946b1f-6d3e-457c-b78f-213f94b2b650","Type":"ContainerDied","Data":"21f2d5a23f955d199378a6a91571fa9740e60d4046211aed7df111b8fc86ee3a"} Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.476178 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-47pqd" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.489946 4742 generic.go:334] "Generic (PLEG): container finished" podID="f33a63f1-688a-46eb-a32f-5259fa969528" containerID="07209bafc5652475336d6433879d25838a57ab299ba848584623007f65a38619" exitCode=0 Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.490016 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.490114 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" event={"ID":"f33a63f1-688a-46eb-a32f-5259fa969528","Type":"ContainerDied","Data":"07209bafc5652475336d6433879d25838a57ab299ba848584623007f65a38619"} Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.490157 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6n4cr" event={"ID":"f33a63f1-688a-46eb-a32f-5259fa969528","Type":"ContainerDied","Data":"a0cfa0b1e3062ca7ba5b72dc6b75c65c2f3f3b476a1faba51de5910fe5691f5c"} Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.492336 4742 generic.go:334] "Generic (PLEG): container finished" podID="72e6f877-4431-46ba-8c22-0479a383851b" containerID="bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963" exitCode=0 Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.492397 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5v4hw" event={"ID":"72e6f877-4431-46ba-8c22-0479a383851b","Type":"ContainerDied","Data":"bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963"} Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.492414 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5v4hw" event={"ID":"72e6f877-4431-46ba-8c22-0479a383851b","Type":"ContainerDied","Data":"1d4b2caa7f8e1ad04201a863cab02f5690154c0e24d3fea9a1769cd7c06a56ce"} Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.492492 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5v4hw" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.495023 4742 generic.go:334] "Generic (PLEG): container finished" podID="ce3a51df-d6e4-46d5-95e3-8be6aaba196f" containerID="17a2045c0843315f80387afe38ad29abf88b8c06659c4014ee2a76740605e7c6" exitCode=0 Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.495066 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p27vh" event={"ID":"ce3a51df-d6e4-46d5-95e3-8be6aaba196f","Type":"ContainerDied","Data":"17a2045c0843315f80387afe38ad29abf88b8c06659c4014ee2a76740605e7c6"} Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.495089 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p27vh" event={"ID":"ce3a51df-d6e4-46d5-95e3-8be6aaba196f","Type":"ContainerDied","Data":"970e14da973914f33acd8fc38cc0abd459e88f062e298899a6e3945894a002c7"} Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.495146 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p27vh" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.511017 4742 scope.go:117] "RemoveContainer" containerID="b82748e7724c6322f05fe1c0e570dbc35567a479e20975a5a9cfed7d12958648" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.548933 4742 scope.go:117] "RemoveContainer" containerID="04ed2a1da8652b1cd0b35faff8375071651dedb4b2ab99e74e891f2f33e13e58" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.549043 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cnzgk"] Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.561789 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cnzgk"] Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.569602 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5v4hw"] Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.570013 4742 scope.go:117] "RemoveContainer" containerID="69bfcbe9d9a000afa411d70603f541dc847747a1f3a53f3401ad9723ce234031" Mar 17 11:19:32 crc kubenswrapper[4742]: E0317 11:19:32.570866 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69bfcbe9d9a000afa411d70603f541dc847747a1f3a53f3401ad9723ce234031\": container with ID starting with 69bfcbe9d9a000afa411d70603f541dc847747a1f3a53f3401ad9723ce234031 not found: ID does not exist" containerID="69bfcbe9d9a000afa411d70603f541dc847747a1f3a53f3401ad9723ce234031" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.570917 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69bfcbe9d9a000afa411d70603f541dc847747a1f3a53f3401ad9723ce234031"} err="failed to get container status \"69bfcbe9d9a000afa411d70603f541dc847747a1f3a53f3401ad9723ce234031\": rpc error: code = NotFound desc = could not find container \"69bfcbe9d9a000afa411d70603f541dc847747a1f3a53f3401ad9723ce234031\": container with ID starting with 69bfcbe9d9a000afa411d70603f541dc847747a1f3a53f3401ad9723ce234031 not found: ID does not exist" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.570946 4742 scope.go:117] "RemoveContainer" containerID="b82748e7724c6322f05fe1c0e570dbc35567a479e20975a5a9cfed7d12958648" Mar 17 11:19:32 crc kubenswrapper[4742]: E0317 11:19:32.572224 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b82748e7724c6322f05fe1c0e570dbc35567a479e20975a5a9cfed7d12958648\": container with ID starting with b82748e7724c6322f05fe1c0e570dbc35567a479e20975a5a9cfed7d12958648 not found: ID does not exist" containerID="b82748e7724c6322f05fe1c0e570dbc35567a479e20975a5a9cfed7d12958648" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.572252 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b82748e7724c6322f05fe1c0e570dbc35567a479e20975a5a9cfed7d12958648"} err="failed to get container status \"b82748e7724c6322f05fe1c0e570dbc35567a479e20975a5a9cfed7d12958648\": rpc error: code = NotFound desc = could not find container \"b82748e7724c6322f05fe1c0e570dbc35567a479e20975a5a9cfed7d12958648\": container with ID starting with b82748e7724c6322f05fe1c0e570dbc35567a479e20975a5a9cfed7d12958648 not found: ID does not exist" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.572269 4742 scope.go:117] "RemoveContainer" 
containerID="04ed2a1da8652b1cd0b35faff8375071651dedb4b2ab99e74e891f2f33e13e58" Mar 17 11:19:32 crc kubenswrapper[4742]: E0317 11:19:32.572603 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04ed2a1da8652b1cd0b35faff8375071651dedb4b2ab99e74e891f2f33e13e58\": container with ID starting with 04ed2a1da8652b1cd0b35faff8375071651dedb4b2ab99e74e891f2f33e13e58 not found: ID does not exist" containerID="04ed2a1da8652b1cd0b35faff8375071651dedb4b2ab99e74e891f2f33e13e58" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.572699 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ed2a1da8652b1cd0b35faff8375071651dedb4b2ab99e74e891f2f33e13e58"} err="failed to get container status \"04ed2a1da8652b1cd0b35faff8375071651dedb4b2ab99e74e891f2f33e13e58\": rpc error: code = NotFound desc = could not find container \"04ed2a1da8652b1cd0b35faff8375071651dedb4b2ab99e74e891f2f33e13e58\": container with ID starting with 04ed2a1da8652b1cd0b35faff8375071651dedb4b2ab99e74e891f2f33e13e58 not found: ID does not exist" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.572726 4742 scope.go:117] "RemoveContainer" containerID="00d7d9ffa3c91c1b26a7a94c4b95be9c369685a79036814e469d8c0dec03d7e0" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.579654 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5v4hw"] Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.583550 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-47pqd"] Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.592709 4742 scope.go:117] "RemoveContainer" containerID="731b497c91bb2f0675aa335ccb0442569123b028ddcba64c5f9f185f223355bf" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.597106 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-47pqd"] Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.612812 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6n4cr"] Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.621221 4742 scope.go:117] "RemoveContainer" containerID="bd145430c6009412a8ca4df21318b658485841e313380d1acbef3f19805746b3" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.623265 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6n4cr"] Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.630124 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p27vh"] Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.639516 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p27vh"] Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.639975 4742 scope.go:117] "RemoveContainer" containerID="00d7d9ffa3c91c1b26a7a94c4b95be9c369685a79036814e469d8c0dec03d7e0" Mar 17 11:19:32 crc kubenswrapper[4742]: E0317 11:19:32.640408 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00d7d9ffa3c91c1b26a7a94c4b95be9c369685a79036814e469d8c0dec03d7e0\": container with ID starting with 00d7d9ffa3c91c1b26a7a94c4b95be9c369685a79036814e469d8c0dec03d7e0 not found: ID does not exist" containerID="00d7d9ffa3c91c1b26a7a94c4b95be9c369685a79036814e469d8c0dec03d7e0" Mar 17 
11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.640472 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00d7d9ffa3c91c1b26a7a94c4b95be9c369685a79036814e469d8c0dec03d7e0"} err="failed to get container status \"00d7d9ffa3c91c1b26a7a94c4b95be9c369685a79036814e469d8c0dec03d7e0\": rpc error: code = NotFound desc = could not find container \"00d7d9ffa3c91c1b26a7a94c4b95be9c369685a79036814e469d8c0dec03d7e0\": container with ID starting with 00d7d9ffa3c91c1b26a7a94c4b95be9c369685a79036814e469d8c0dec03d7e0 not found: ID does not exist" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.640505 4742 scope.go:117] "RemoveContainer" containerID="731b497c91bb2f0675aa335ccb0442569123b028ddcba64c5f9f185f223355bf" Mar 17 11:19:32 crc kubenswrapper[4742]: E0317 11:19:32.640893 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"731b497c91bb2f0675aa335ccb0442569123b028ddcba64c5f9f185f223355bf\": container with ID starting with 731b497c91bb2f0675aa335ccb0442569123b028ddcba64c5f9f185f223355bf not found: ID does not exist" containerID="731b497c91bb2f0675aa335ccb0442569123b028ddcba64c5f9f185f223355bf" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.640954 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"731b497c91bb2f0675aa335ccb0442569123b028ddcba64c5f9f185f223355bf"} err="failed to get container status \"731b497c91bb2f0675aa335ccb0442569123b028ddcba64c5f9f185f223355bf\": rpc error: code = NotFound desc = could not find container \"731b497c91bb2f0675aa335ccb0442569123b028ddcba64c5f9f185f223355bf\": container with ID starting with 731b497c91bb2f0675aa335ccb0442569123b028ddcba64c5f9f185f223355bf not found: ID does not exist" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.640982 4742 scope.go:117] "RemoveContainer" containerID="bd145430c6009412a8ca4df21318b658485841e313380d1acbef3f19805746b3" Mar 17 11:19:32 crc kubenswrapper[4742]: E0317 11:19:32.641270 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd145430c6009412a8ca4df21318b658485841e313380d1acbef3f19805746b3\": container with ID starting with bd145430c6009412a8ca4df21318b658485841e313380d1acbef3f19805746b3 not found: ID does not exist" containerID="bd145430c6009412a8ca4df21318b658485841e313380d1acbef3f19805746b3" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.641308 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd145430c6009412a8ca4df21318b658485841e313380d1acbef3f19805746b3"} err="failed to get container status \"bd145430c6009412a8ca4df21318b658485841e313380d1acbef3f19805746b3\": rpc error: code = NotFound desc = could not find container \"bd145430c6009412a8ca4df21318b658485841e313380d1acbef3f19805746b3\": container with ID starting with bd145430c6009412a8ca4df21318b658485841e313380d1acbef3f19805746b3 not found: ID does not exist" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.641336 4742 scope.go:117] "RemoveContainer" containerID="07209bafc5652475336d6433879d25838a57ab299ba848584623007f65a38619" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.653781 4742 scope.go:117] "RemoveContainer" containerID="6bd100bf1cb3a9dd55bab92415f41ddfff93a7f78e8b9d3b6325876762e27138" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.676218 4742 scope.go:117] "RemoveContainer" 
containerID="07209bafc5652475336d6433879d25838a57ab299ba848584623007f65a38619" Mar 17 11:19:32 crc kubenswrapper[4742]: E0317 11:19:32.678201 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07209bafc5652475336d6433879d25838a57ab299ba848584623007f65a38619\": container with ID starting with 07209bafc5652475336d6433879d25838a57ab299ba848584623007f65a38619 not found: ID does not exist" containerID="07209bafc5652475336d6433879d25838a57ab299ba848584623007f65a38619" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.678245 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07209bafc5652475336d6433879d25838a57ab299ba848584623007f65a38619"} err="failed to get container status \"07209bafc5652475336d6433879d25838a57ab299ba848584623007f65a38619\": rpc error: code = NotFound desc = could not find container \"07209bafc5652475336d6433879d25838a57ab299ba848584623007f65a38619\": container with ID starting with 07209bafc5652475336d6433879d25838a57ab299ba848584623007f65a38619 not found: ID does not exist" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.678276 4742 scope.go:117] "RemoveContainer" containerID="6bd100bf1cb3a9dd55bab92415f41ddfff93a7f78e8b9d3b6325876762e27138" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.678777 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24946b1f-6d3e-457c-b78f-213f94b2b650" path="/var/lib/kubelet/pods/24946b1f-6d3e-457c-b78f-213f94b2b650/volumes" Mar 17 11:19:32 crc kubenswrapper[4742]: E0317 11:19:32.678777 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bd100bf1cb3a9dd55bab92415f41ddfff93a7f78e8b9d3b6325876762e27138\": container with ID starting with 6bd100bf1cb3a9dd55bab92415f41ddfff93a7f78e8b9d3b6325876762e27138 not found: ID does not exist" containerID="6bd100bf1cb3a9dd55bab92415f41ddfff93a7f78e8b9d3b6325876762e27138" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.678853 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bd100bf1cb3a9dd55bab92415f41ddfff93a7f78e8b9d3b6325876762e27138"} err="failed to get container status \"6bd100bf1cb3a9dd55bab92415f41ddfff93a7f78e8b9d3b6325876762e27138\": rpc error: code = NotFound desc = could not find container \"6bd100bf1cb3a9dd55bab92415f41ddfff93a7f78e8b9d3b6325876762e27138\": container with ID starting with 6bd100bf1cb3a9dd55bab92415f41ddfff93a7f78e8b9d3b6325876762e27138 not found: ID does not exist" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.678879 4742 scope.go:117] "RemoveContainer" containerID="bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.679384 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b80c435-0e24-4ab2-980c-f2dfb1baef87" path="/var/lib/kubelet/pods/4b80c435-0e24-4ab2-980c-f2dfb1baef87/volumes" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.679926 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72e6f877-4431-46ba-8c22-0479a383851b" path="/var/lib/kubelet/pods/72e6f877-4431-46ba-8c22-0479a383851b/volumes" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.680861 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce3a51df-d6e4-46d5-95e3-8be6aaba196f" path="/var/lib/kubelet/pods/ce3a51df-d6e4-46d5-95e3-8be6aaba196f/volumes" 
Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.681433 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f33a63f1-688a-46eb-a32f-5259fa969528" path="/var/lib/kubelet/pods/f33a63f1-688a-46eb-a32f-5259fa969528/volumes" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.693749 4742 scope.go:117] "RemoveContainer" containerID="ab31dfd2527d00b4290f7c0d2193daf9625e62861189e20d7225985e0570976c" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.706929 4742 scope.go:117] "RemoveContainer" containerID="5e8f9625086cd06185e0d36a1c5105d424adb3367bcaefc6ec9d6c22f6d06951" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.721933 4742 scope.go:117] "RemoveContainer" containerID="bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963" Mar 17 11:19:32 crc kubenswrapper[4742]: E0317 11:19:32.722950 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963\": container with ID starting with bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963 not found: ID does not exist" containerID="bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.722974 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963"} err="failed to get container status \"bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963\": rpc error: code = NotFound desc = could not find container \"bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963\": container with ID starting with bd620d80e5426507c20a1a6c37abf761b8d5d0c51890e49ee61b23ff66f69963 not found: ID does not exist" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.723004 4742 scope.go:117] "RemoveContainer" containerID="ab31dfd2527d00b4290f7c0d2193daf9625e62861189e20d7225985e0570976c" Mar 17 11:19:32 crc kubenswrapper[4742]: E0317 11:19:32.723349 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab31dfd2527d00b4290f7c0d2193daf9625e62861189e20d7225985e0570976c\": container with ID starting with ab31dfd2527d00b4290f7c0d2193daf9625e62861189e20d7225985e0570976c not found: ID does not exist" containerID="ab31dfd2527d00b4290f7c0d2193daf9625e62861189e20d7225985e0570976c" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.723400 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab31dfd2527d00b4290f7c0d2193daf9625e62861189e20d7225985e0570976c"} err="failed to get container status \"ab31dfd2527d00b4290f7c0d2193daf9625e62861189e20d7225985e0570976c\": rpc error: code = NotFound desc = could not find container \"ab31dfd2527d00b4290f7c0d2193daf9625e62861189e20d7225985e0570976c\": container with ID starting with ab31dfd2527d00b4290f7c0d2193daf9625e62861189e20d7225985e0570976c not found: ID does not exist" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.723438 4742 scope.go:117] "RemoveContainer" containerID="5e8f9625086cd06185e0d36a1c5105d424adb3367bcaefc6ec9d6c22f6d06951" Mar 17 11:19:32 crc kubenswrapper[4742]: E0317 11:19:32.723816 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e8f9625086cd06185e0d36a1c5105d424adb3367bcaefc6ec9d6c22f6d06951\": container with ID starting with 
5e8f9625086cd06185e0d36a1c5105d424adb3367bcaefc6ec9d6c22f6d06951 not found: ID does not exist" containerID="5e8f9625086cd06185e0d36a1c5105d424adb3367bcaefc6ec9d6c22f6d06951" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.723847 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e8f9625086cd06185e0d36a1c5105d424adb3367bcaefc6ec9d6c22f6d06951"} err="failed to get container status \"5e8f9625086cd06185e0d36a1c5105d424adb3367bcaefc6ec9d6c22f6d06951\": rpc error: code = NotFound desc = could not find container \"5e8f9625086cd06185e0d36a1c5105d424adb3367bcaefc6ec9d6c22f6d06951\": container with ID starting with 5e8f9625086cd06185e0d36a1c5105d424adb3367bcaefc6ec9d6c22f6d06951 not found: ID does not exist" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.723868 4742 scope.go:117] "RemoveContainer" containerID="17a2045c0843315f80387afe38ad29abf88b8c06659c4014ee2a76740605e7c6" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.755605 4742 scope.go:117] "RemoveContainer" containerID="c3d3f4713c444f242c6ffb4f4d76e1e7bddaece8ab74469f733e981aca89b4e9" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.817288 4742 scope.go:117] "RemoveContainer" containerID="3bbcdce06bb4648cd2f4a95914e39856ee89b537e1ba9c1dd1da9b4c348c9536" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.834354 4742 scope.go:117] "RemoveContainer" containerID="17a2045c0843315f80387afe38ad29abf88b8c06659c4014ee2a76740605e7c6" Mar 17 11:19:32 crc kubenswrapper[4742]: E0317 11:19:32.835190 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17a2045c0843315f80387afe38ad29abf88b8c06659c4014ee2a76740605e7c6\": container with ID starting with 17a2045c0843315f80387afe38ad29abf88b8c06659c4014ee2a76740605e7c6 not found: ID does not exist" containerID="17a2045c0843315f80387afe38ad29abf88b8c06659c4014ee2a76740605e7c6" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.835226 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a2045c0843315f80387afe38ad29abf88b8c06659c4014ee2a76740605e7c6"} err="failed to get container status \"17a2045c0843315f80387afe38ad29abf88b8c06659c4014ee2a76740605e7c6\": rpc error: code = NotFound desc = could not find container \"17a2045c0843315f80387afe38ad29abf88b8c06659c4014ee2a76740605e7c6\": container with ID starting with 17a2045c0843315f80387afe38ad29abf88b8c06659c4014ee2a76740605e7c6 not found: ID does not exist" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.835262 4742 scope.go:117] "RemoveContainer" containerID="c3d3f4713c444f242c6ffb4f4d76e1e7bddaece8ab74469f733e981aca89b4e9" Mar 17 11:19:32 crc kubenswrapper[4742]: E0317 11:19:32.835543 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3d3f4713c444f242c6ffb4f4d76e1e7bddaece8ab74469f733e981aca89b4e9\": container with ID starting with c3d3f4713c444f242c6ffb4f4d76e1e7bddaece8ab74469f733e981aca89b4e9 not found: ID does not exist" containerID="c3d3f4713c444f242c6ffb4f4d76e1e7bddaece8ab74469f733e981aca89b4e9" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.835573 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d3f4713c444f242c6ffb4f4d76e1e7bddaece8ab74469f733e981aca89b4e9"} err="failed to get container status \"c3d3f4713c444f242c6ffb4f4d76e1e7bddaece8ab74469f733e981aca89b4e9\": rpc error: code = NotFound desc 
= could not find container \"c3d3f4713c444f242c6ffb4f4d76e1e7bddaece8ab74469f733e981aca89b4e9\": container with ID starting with c3d3f4713c444f242c6ffb4f4d76e1e7bddaece8ab74469f733e981aca89b4e9 not found: ID does not exist" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.835588 4742 scope.go:117] "RemoveContainer" containerID="3bbcdce06bb4648cd2f4a95914e39856ee89b537e1ba9c1dd1da9b4c348c9536" Mar 17 11:19:32 crc kubenswrapper[4742]: E0317 11:19:32.835853 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bbcdce06bb4648cd2f4a95914e39856ee89b537e1ba9c1dd1da9b4c348c9536\": container with ID starting with 3bbcdce06bb4648cd2f4a95914e39856ee89b537e1ba9c1dd1da9b4c348c9536 not found: ID does not exist" containerID="3bbcdce06bb4648cd2f4a95914e39856ee89b537e1ba9c1dd1da9b4c348c9536" Mar 17 11:19:32 crc kubenswrapper[4742]: I0317 11:19:32.835892 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bbcdce06bb4648cd2f4a95914e39856ee89b537e1ba9c1dd1da9b4c348c9536"} err="failed to get container status \"3bbcdce06bb4648cd2f4a95914e39856ee89b537e1ba9c1dd1da9b4c348c9536\": rpc error: code = NotFound desc = could not find container \"3bbcdce06bb4648cd2f4a95914e39856ee89b537e1ba9c1dd1da9b4c348c9536\": container with ID starting with 3bbcdce06bb4648cd2f4a95914e39856ee89b537e1ba9c1dd1da9b4c348c9536 not found: ID does not exist" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.506447 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rxctp" event={"ID":"66e4c4dd-b0fe-4877-8520-bdbd18b096d4","Type":"ContainerStarted","Data":"44cde5057e3121c9124e0827de878800e2ea0ee59c9f6ca2402108957346b56c"} Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.506697 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rxctp" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.511783 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rxctp" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.537239 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rxctp" podStartSLOduration=2.537213619 podStartE2EDuration="2.537213619s" podCreationTimestamp="2026-03-17 11:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:19:33.530413514 +0000 UTC m=+476.656541352" watchObservedRunningTime="2026-03-17 11:19:33.537213619 +0000 UTC m=+476.663341417" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830312 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rhqsm"] Mar 17 11:19:33 crc kubenswrapper[4742]: E0317 11:19:33.830500 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b80c435-0e24-4ab2-980c-f2dfb1baef87" containerName="registry-server" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830511 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b80c435-0e24-4ab2-980c-f2dfb1baef87" containerName="registry-server" Mar 17 11:19:33 crc kubenswrapper[4742]: E0317 11:19:33.830519 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b80c435-0e24-4ab2-980c-f2dfb1baef87" containerName="extract-content" 
Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830524 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b80c435-0e24-4ab2-980c-f2dfb1baef87" containerName="extract-content" Mar 17 11:19:33 crc kubenswrapper[4742]: E0317 11:19:33.830533 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f33a63f1-688a-46eb-a32f-5259fa969528" containerName="marketplace-operator" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830538 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f33a63f1-688a-46eb-a32f-5259fa969528" containerName="marketplace-operator" Mar 17 11:19:33 crc kubenswrapper[4742]: E0317 11:19:33.830547 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3a51df-d6e4-46d5-95e3-8be6aaba196f" containerName="extract-utilities" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830553 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3a51df-d6e4-46d5-95e3-8be6aaba196f" containerName="extract-utilities" Mar 17 11:19:33 crc kubenswrapper[4742]: E0317 11:19:33.830561 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72e6f877-4431-46ba-8c22-0479a383851b" containerName="registry-server" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830567 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="72e6f877-4431-46ba-8c22-0479a383851b" containerName="registry-server" Mar 17 11:19:33 crc kubenswrapper[4742]: E0317 11:19:33.830574 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b80c435-0e24-4ab2-980c-f2dfb1baef87" containerName="extract-utilities" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830580 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b80c435-0e24-4ab2-980c-f2dfb1baef87" containerName="extract-utilities" Mar 17 11:19:33 crc kubenswrapper[4742]: E0317 11:19:33.830587 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72e6f877-4431-46ba-8c22-0479a383851b" containerName="extract-content" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830592 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="72e6f877-4431-46ba-8c22-0479a383851b" containerName="extract-content" Mar 17 11:19:33 crc kubenswrapper[4742]: E0317 11:19:33.830599 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24946b1f-6d3e-457c-b78f-213f94b2b650" containerName="registry-server" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830604 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="24946b1f-6d3e-457c-b78f-213f94b2b650" containerName="registry-server" Mar 17 11:19:33 crc kubenswrapper[4742]: E0317 11:19:33.830614 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce3a51df-d6e4-46d5-95e3-8be6aaba196f" containerName="extract-content" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830621 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3a51df-d6e4-46d5-95e3-8be6aaba196f" containerName="extract-content" Mar 17 11:19:33 crc kubenswrapper[4742]: E0317 11:19:33.830630 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24946b1f-6d3e-457c-b78f-213f94b2b650" containerName="extract-utilities" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830635 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="24946b1f-6d3e-457c-b78f-213f94b2b650" containerName="extract-utilities" Mar 17 11:19:33 crc kubenswrapper[4742]: E0317 11:19:33.830645 4742 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ce3a51df-d6e4-46d5-95e3-8be6aaba196f" containerName="registry-server" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830651 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce3a51df-d6e4-46d5-95e3-8be6aaba196f" containerName="registry-server" Mar 17 11:19:33 crc kubenswrapper[4742]: E0317 11:19:33.830659 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24946b1f-6d3e-457c-b78f-213f94b2b650" containerName="extract-content" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830666 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="24946b1f-6d3e-457c-b78f-213f94b2b650" containerName="extract-content" Mar 17 11:19:33 crc kubenswrapper[4742]: E0317 11:19:33.830674 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72e6f877-4431-46ba-8c22-0479a383851b" containerName="extract-utilities" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830682 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="72e6f877-4431-46ba-8c22-0479a383851b" containerName="extract-utilities" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830764 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f33a63f1-688a-46eb-a32f-5259fa969528" containerName="marketplace-operator" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830773 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="24946b1f-6d3e-457c-b78f-213f94b2b650" containerName="registry-server" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830781 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce3a51df-d6e4-46d5-95e3-8be6aaba196f" containerName="registry-server" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830791 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b80c435-0e24-4ab2-980c-f2dfb1baef87" containerName="registry-server" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830799 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="72e6f877-4431-46ba-8c22-0479a383851b" containerName="registry-server" Mar 17 11:19:33 crc kubenswrapper[4742]: E0317 11:19:33.830881 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f33a63f1-688a-46eb-a32f-5259fa969528" containerName="marketplace-operator" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830888 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f33a63f1-688a-46eb-a32f-5259fa969528" containerName="marketplace-operator" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.830998 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f33a63f1-688a-46eb-a32f-5259fa969528" containerName="marketplace-operator" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.832172 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhqsm" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.835305 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.836158 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgllw\" (UniqueName: \"kubernetes.io/projected/e827c1af-bb51-4f3d-bf81-708986989404-kube-api-access-sgllw\") pod \"redhat-marketplace-rhqsm\" (UID: \"e827c1af-bb51-4f3d-bf81-708986989404\") " pod="openshift-marketplace/redhat-marketplace-rhqsm" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.836196 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e827c1af-bb51-4f3d-bf81-708986989404-utilities\") pod \"redhat-marketplace-rhqsm\" (UID: \"e827c1af-bb51-4f3d-bf81-708986989404\") " pod="openshift-marketplace/redhat-marketplace-rhqsm" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.836246 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e827c1af-bb51-4f3d-bf81-708986989404-catalog-content\") pod \"redhat-marketplace-rhqsm\" (UID: \"e827c1af-bb51-4f3d-bf81-708986989404\") " pod="openshift-marketplace/redhat-marketplace-rhqsm" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.851702 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhqsm"] Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.937152 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgllw\" (UniqueName: \"kubernetes.io/projected/e827c1af-bb51-4f3d-bf81-708986989404-kube-api-access-sgllw\") pod \"redhat-marketplace-rhqsm\" (UID: \"e827c1af-bb51-4f3d-bf81-708986989404\") " pod="openshift-marketplace/redhat-marketplace-rhqsm" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.937221 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e827c1af-bb51-4f3d-bf81-708986989404-utilities\") pod \"redhat-marketplace-rhqsm\" (UID: \"e827c1af-bb51-4f3d-bf81-708986989404\") " pod="openshift-marketplace/redhat-marketplace-rhqsm" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.937287 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e827c1af-bb51-4f3d-bf81-708986989404-catalog-content\") pod \"redhat-marketplace-rhqsm\" (UID: \"e827c1af-bb51-4f3d-bf81-708986989404\") " pod="openshift-marketplace/redhat-marketplace-rhqsm" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.938118 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e827c1af-bb51-4f3d-bf81-708986989404-catalog-content\") pod \"redhat-marketplace-rhqsm\" (UID: \"e827c1af-bb51-4f3d-bf81-708986989404\") " pod="openshift-marketplace/redhat-marketplace-rhqsm" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.939145 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e827c1af-bb51-4f3d-bf81-708986989404-utilities\") pod \"redhat-marketplace-rhqsm\" (UID: 
\"e827c1af-bb51-4f3d-bf81-708986989404\") " pod="openshift-marketplace/redhat-marketplace-rhqsm" Mar 17 11:19:33 crc kubenswrapper[4742]: I0317 11:19:33.954101 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgllw\" (UniqueName: \"kubernetes.io/projected/e827c1af-bb51-4f3d-bf81-708986989404-kube-api-access-sgllw\") pod \"redhat-marketplace-rhqsm\" (UID: \"e827c1af-bb51-4f3d-bf81-708986989404\") " pod="openshift-marketplace/redhat-marketplace-rhqsm" Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.031087 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5nq4d"] Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.033255 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5nq4d" Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.035167 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.038018 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebd9754c-6bff-490f-a8c5-5aa16bb9170e-catalog-content\") pod \"certified-operators-5nq4d\" (UID: \"ebd9754c-6bff-490f-a8c5-5aa16bb9170e\") " pod="openshift-marketplace/certified-operators-5nq4d" Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.038129 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebd9754c-6bff-490f-a8c5-5aa16bb9170e-utilities\") pod \"certified-operators-5nq4d\" (UID: \"ebd9754c-6bff-490f-a8c5-5aa16bb9170e\") " pod="openshift-marketplace/certified-operators-5nq4d" Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.038174 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc6cb\" (UniqueName: \"kubernetes.io/projected/ebd9754c-6bff-490f-a8c5-5aa16bb9170e-kube-api-access-lc6cb\") pod \"certified-operators-5nq4d\" (UID: \"ebd9754c-6bff-490f-a8c5-5aa16bb9170e\") " pod="openshift-marketplace/certified-operators-5nq4d" Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.038930 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5nq4d"] Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.139426 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebd9754c-6bff-490f-a8c5-5aa16bb9170e-utilities\") pod \"certified-operators-5nq4d\" (UID: \"ebd9754c-6bff-490f-a8c5-5aa16bb9170e\") " pod="openshift-marketplace/certified-operators-5nq4d" Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.139782 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc6cb\" (UniqueName: \"kubernetes.io/projected/ebd9754c-6bff-490f-a8c5-5aa16bb9170e-kube-api-access-lc6cb\") pod \"certified-operators-5nq4d\" (UID: \"ebd9754c-6bff-490f-a8c5-5aa16bb9170e\") " pod="openshift-marketplace/certified-operators-5nq4d" Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.139962 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebd9754c-6bff-490f-a8c5-5aa16bb9170e-catalog-content\") pod 
\"certified-operators-5nq4d\" (UID: \"ebd9754c-6bff-490f-a8c5-5aa16bb9170e\") " pod="openshift-marketplace/certified-operators-5nq4d" Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.139968 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebd9754c-6bff-490f-a8c5-5aa16bb9170e-utilities\") pod \"certified-operators-5nq4d\" (UID: \"ebd9754c-6bff-490f-a8c5-5aa16bb9170e\") " pod="openshift-marketplace/certified-operators-5nq4d" Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.140202 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebd9754c-6bff-490f-a8c5-5aa16bb9170e-catalog-content\") pod \"certified-operators-5nq4d\" (UID: \"ebd9754c-6bff-490f-a8c5-5aa16bb9170e\") " pod="openshift-marketplace/certified-operators-5nq4d" Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.151379 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rhqsm" Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.155239 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc6cb\" (UniqueName: \"kubernetes.io/projected/ebd9754c-6bff-490f-a8c5-5aa16bb9170e-kube-api-access-lc6cb\") pod \"certified-operators-5nq4d\" (UID: \"ebd9754c-6bff-490f-a8c5-5aa16bb9170e\") " pod="openshift-marketplace/certified-operators-5nq4d" Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.354648 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5nq4d" Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.358020 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rhqsm"] Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.519083 4742 generic.go:334] "Generic (PLEG): container finished" podID="e827c1af-bb51-4f3d-bf81-708986989404" containerID="2117a7aab0979cf3f0d896a7a3f3a0437676286132881b36d469b2d630a6766d" exitCode=0 Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.519447 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhqsm" event={"ID":"e827c1af-bb51-4f3d-bf81-708986989404","Type":"ContainerDied","Data":"2117a7aab0979cf3f0d896a7a3f3a0437676286132881b36d469b2d630a6766d"} Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.519541 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhqsm" event={"ID":"e827c1af-bb51-4f3d-bf81-708986989404","Type":"ContainerStarted","Data":"7c6edcb4cee66868c0f9bd33a4438306999dc332202e2617c92f3ddf10ff1df9"} Mar 17 11:19:34 crc kubenswrapper[4742]: I0317 11:19:34.527475 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5nq4d"] Mar 17 11:19:34 crc kubenswrapper[4742]: W0317 11:19:34.563382 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebd9754c_6bff_490f_a8c5_5aa16bb9170e.slice/crio-8e85b272f4a1446a7db1bf5888bf109dff7b9feaf0e56a1c01afe1ce2ce49bee WatchSource:0}: Error finding container 8e85b272f4a1446a7db1bf5888bf109dff7b9feaf0e56a1c01afe1ce2ce49bee: Status 404 returned error can't find the container with id 8e85b272f4a1446a7db1bf5888bf109dff7b9feaf0e56a1c01afe1ce2ce49bee Mar 17 11:19:35 crc kubenswrapper[4742]: I0317 11:19:35.539460 4742 generic.go:334] 
"Generic (PLEG): container finished" podID="ebd9754c-6bff-490f-a8c5-5aa16bb9170e" containerID="11a5ff1e4ec199410005fcb65780dbb488413cad2c4cf68888df27c2d4f6c8e8" exitCode=0 Mar 17 11:19:35 crc kubenswrapper[4742]: I0317 11:19:35.540489 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nq4d" event={"ID":"ebd9754c-6bff-490f-a8c5-5aa16bb9170e","Type":"ContainerDied","Data":"11a5ff1e4ec199410005fcb65780dbb488413cad2c4cf68888df27c2d4f6c8e8"} Mar 17 11:19:35 crc kubenswrapper[4742]: I0317 11:19:35.540530 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nq4d" event={"ID":"ebd9754c-6bff-490f-a8c5-5aa16bb9170e","Type":"ContainerStarted","Data":"8e85b272f4a1446a7db1bf5888bf109dff7b9feaf0e56a1c01afe1ce2ce49bee"} Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.233414 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p52h7"] Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.242519 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p52h7" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.242650 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p52h7"] Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.245260 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.272622 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a52e996e-9305-4a8a-bb51-9d2d72223dcf-utilities\") pod \"redhat-operators-p52h7\" (UID: \"a52e996e-9305-4a8a-bb51-9d2d72223dcf\") " pod="openshift-marketplace/redhat-operators-p52h7" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.272663 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a52e996e-9305-4a8a-bb51-9d2d72223dcf-catalog-content\") pod \"redhat-operators-p52h7\" (UID: \"a52e996e-9305-4a8a-bb51-9d2d72223dcf\") " pod="openshift-marketplace/redhat-operators-p52h7" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.272705 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fntfc\" (UniqueName: \"kubernetes.io/projected/a52e996e-9305-4a8a-bb51-9d2d72223dcf-kube-api-access-fntfc\") pod \"redhat-operators-p52h7\" (UID: \"a52e996e-9305-4a8a-bb51-9d2d72223dcf\") " pod="openshift-marketplace/redhat-operators-p52h7" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.373594 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a52e996e-9305-4a8a-bb51-9d2d72223dcf-utilities\") pod \"redhat-operators-p52h7\" (UID: \"a52e996e-9305-4a8a-bb51-9d2d72223dcf\") " pod="openshift-marketplace/redhat-operators-p52h7" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.373648 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a52e996e-9305-4a8a-bb51-9d2d72223dcf-catalog-content\") pod \"redhat-operators-p52h7\" (UID: \"a52e996e-9305-4a8a-bb51-9d2d72223dcf\") " pod="openshift-marketplace/redhat-operators-p52h7" Mar 17 11:19:36 crc 
kubenswrapper[4742]: I0317 11:19:36.373793 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fntfc\" (UniqueName: \"kubernetes.io/projected/a52e996e-9305-4a8a-bb51-9d2d72223dcf-kube-api-access-fntfc\") pod \"redhat-operators-p52h7\" (UID: \"a52e996e-9305-4a8a-bb51-9d2d72223dcf\") " pod="openshift-marketplace/redhat-operators-p52h7" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.374314 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a52e996e-9305-4a8a-bb51-9d2d72223dcf-utilities\") pod \"redhat-operators-p52h7\" (UID: \"a52e996e-9305-4a8a-bb51-9d2d72223dcf\") " pod="openshift-marketplace/redhat-operators-p52h7" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.374503 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a52e996e-9305-4a8a-bb51-9d2d72223dcf-catalog-content\") pod \"redhat-operators-p52h7\" (UID: \"a52e996e-9305-4a8a-bb51-9d2d72223dcf\") " pod="openshift-marketplace/redhat-operators-p52h7" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.407944 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fntfc\" (UniqueName: \"kubernetes.io/projected/a52e996e-9305-4a8a-bb51-9d2d72223dcf-kube-api-access-fntfc\") pod \"redhat-operators-p52h7\" (UID: \"a52e996e-9305-4a8a-bb51-9d2d72223dcf\") " pod="openshift-marketplace/redhat-operators-p52h7" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.433193 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vnvhp"] Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.434277 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vnvhp" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.436128 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.439473 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vnvhp"] Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.546433 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nq4d" event={"ID":"ebd9754c-6bff-490f-a8c5-5aa16bb9170e","Type":"ContainerStarted","Data":"645efc4e49198b54983bce31f799c2aac9658bef2010262eaee5df51f8a2b2ab"} Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.549130 4742 generic.go:334] "Generic (PLEG): container finished" podID="e827c1af-bb51-4f3d-bf81-708986989404" containerID="fdcb10fddb508681501fbac14214a5fda3f4b381ea539b40410d0d72ab401fa8" exitCode=0 Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.549170 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhqsm" event={"ID":"e827c1af-bb51-4f3d-bf81-708986989404","Type":"ContainerDied","Data":"fdcb10fddb508681501fbac14214a5fda3f4b381ea539b40410d0d72ab401fa8"} Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.576830 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdgd2\" (UniqueName: \"kubernetes.io/projected/5b587550-1bc7-4b07-a980-39f876baec8f-kube-api-access-qdgd2\") pod \"community-operators-vnvhp\" (UID: \"5b587550-1bc7-4b07-a980-39f876baec8f\") " pod="openshift-marketplace/community-operators-vnvhp" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.576964 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b587550-1bc7-4b07-a980-39f876baec8f-utilities\") pod \"community-operators-vnvhp\" (UID: \"5b587550-1bc7-4b07-a980-39f876baec8f\") " pod="openshift-marketplace/community-operators-vnvhp" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.576982 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b587550-1bc7-4b07-a980-39f876baec8f-catalog-content\") pod \"community-operators-vnvhp\" (UID: \"5b587550-1bc7-4b07-a980-39f876baec8f\") " pod="openshift-marketplace/community-operators-vnvhp" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.590470 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p52h7" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.677948 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b587550-1bc7-4b07-a980-39f876baec8f-utilities\") pod \"community-operators-vnvhp\" (UID: \"5b587550-1bc7-4b07-a980-39f876baec8f\") " pod="openshift-marketplace/community-operators-vnvhp" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.678244 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b587550-1bc7-4b07-a980-39f876baec8f-catalog-content\") pod \"community-operators-vnvhp\" (UID: \"5b587550-1bc7-4b07-a980-39f876baec8f\") " pod="openshift-marketplace/community-operators-vnvhp" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.678346 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdgd2\" (UniqueName: \"kubernetes.io/projected/5b587550-1bc7-4b07-a980-39f876baec8f-kube-api-access-qdgd2\") pod \"community-operators-vnvhp\" (UID: \"5b587550-1bc7-4b07-a980-39f876baec8f\") " pod="openshift-marketplace/community-operators-vnvhp" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.678447 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b587550-1bc7-4b07-a980-39f876baec8f-utilities\") pod \"community-operators-vnvhp\" (UID: \"5b587550-1bc7-4b07-a980-39f876baec8f\") " pod="openshift-marketplace/community-operators-vnvhp" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.678477 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b587550-1bc7-4b07-a980-39f876baec8f-catalog-content\") pod \"community-operators-vnvhp\" (UID: \"5b587550-1bc7-4b07-a980-39f876baec8f\") " pod="openshift-marketplace/community-operators-vnvhp" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.704755 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdgd2\" (UniqueName: \"kubernetes.io/projected/5b587550-1bc7-4b07-a980-39f876baec8f-kube-api-access-qdgd2\") pod \"community-operators-vnvhp\" (UID: \"5b587550-1bc7-4b07-a980-39f876baec8f\") " pod="openshift-marketplace/community-operators-vnvhp" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.757467 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vnvhp" Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.779408 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p52h7"] Mar 17 11:19:36 crc kubenswrapper[4742]: W0317 11:19:36.786058 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda52e996e_9305_4a8a_bb51_9d2d72223dcf.slice/crio-dbc2270af5ba8a5ca055d16f890c0f3f61f21ee36e4972292896ff3eaaccef54 WatchSource:0}: Error finding container dbc2270af5ba8a5ca055d16f890c0f3f61f21ee36e4972292896ff3eaaccef54: Status 404 returned error can't find the container with id dbc2270af5ba8a5ca055d16f890c0f3f61f21ee36e4972292896ff3eaaccef54 Mar 17 11:19:36 crc kubenswrapper[4742]: I0317 11:19:36.940641 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vnvhp"] Mar 17 11:19:36 crc kubenswrapper[4742]: W0317 11:19:36.948772 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b587550_1bc7_4b07_a980_39f876baec8f.slice/crio-3c2b8dfeff7941f3bebbd462a92dfb224c305517810c94052373e17980f8e91e WatchSource:0}: Error finding container 3c2b8dfeff7941f3bebbd462a92dfb224c305517810c94052373e17980f8e91e: Status 404 returned error can't find the container with id 3c2b8dfeff7941f3bebbd462a92dfb224c305517810c94052373e17980f8e91e Mar 17 11:19:37 crc kubenswrapper[4742]: I0317 11:19:37.557528 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rhqsm" event={"ID":"e827c1af-bb51-4f3d-bf81-708986989404","Type":"ContainerStarted","Data":"c46e21d0558bf9e7c82488fac5ac2e6899811132e64389a45fc70dd894fd555b"} Mar 17 11:19:37 crc kubenswrapper[4742]: I0317 11:19:37.559947 4742 generic.go:334] "Generic (PLEG): container finished" podID="a52e996e-9305-4a8a-bb51-9d2d72223dcf" containerID="626ca01c5ea635077f198085f767b451cdc90b535d7489db51a4162d4a1329ff" exitCode=0 Mar 17 11:19:37 crc kubenswrapper[4742]: I0317 11:19:37.560079 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p52h7" event={"ID":"a52e996e-9305-4a8a-bb51-9d2d72223dcf","Type":"ContainerDied","Data":"626ca01c5ea635077f198085f767b451cdc90b535d7489db51a4162d4a1329ff"} Mar 17 11:19:37 crc kubenswrapper[4742]: I0317 11:19:37.560158 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p52h7" event={"ID":"a52e996e-9305-4a8a-bb51-9d2d72223dcf","Type":"ContainerStarted","Data":"dbc2270af5ba8a5ca055d16f890c0f3f61f21ee36e4972292896ff3eaaccef54"} Mar 17 11:19:37 crc kubenswrapper[4742]: I0317 11:19:37.561760 4742 generic.go:334] "Generic (PLEG): container finished" podID="5b587550-1bc7-4b07-a980-39f876baec8f" containerID="501807a2e7f165e013e1b9af07f52f04b2cfef3778b399df487ba77ddb0cbeea" exitCode=0 Mar 17 11:19:37 crc kubenswrapper[4742]: I0317 11:19:37.561812 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnvhp" event={"ID":"5b587550-1bc7-4b07-a980-39f876baec8f","Type":"ContainerDied","Data":"501807a2e7f165e013e1b9af07f52f04b2cfef3778b399df487ba77ddb0cbeea"} Mar 17 11:19:37 crc kubenswrapper[4742]: I0317 11:19:37.561834 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnvhp" 
event={"ID":"5b587550-1bc7-4b07-a980-39f876baec8f","Type":"ContainerStarted","Data":"3c2b8dfeff7941f3bebbd462a92dfb224c305517810c94052373e17980f8e91e"} Mar 17 11:19:37 crc kubenswrapper[4742]: I0317 11:19:37.564077 4742 generic.go:334] "Generic (PLEG): container finished" podID="ebd9754c-6bff-490f-a8c5-5aa16bb9170e" containerID="645efc4e49198b54983bce31f799c2aac9658bef2010262eaee5df51f8a2b2ab" exitCode=0 Mar 17 11:19:37 crc kubenswrapper[4742]: I0317 11:19:37.564113 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nq4d" event={"ID":"ebd9754c-6bff-490f-a8c5-5aa16bb9170e","Type":"ContainerDied","Data":"645efc4e49198b54983bce31f799c2aac9658bef2010262eaee5df51f8a2b2ab"} Mar 17 11:19:37 crc kubenswrapper[4742]: I0317 11:19:37.578942 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rhqsm" podStartSLOduration=2.153523307 podStartE2EDuration="4.578924776s" podCreationTimestamp="2026-03-17 11:19:33 +0000 UTC" firstStartedPulling="2026-03-17 11:19:34.523466463 +0000 UTC m=+477.649594221" lastFinishedPulling="2026-03-17 11:19:36.948867912 +0000 UTC m=+480.074995690" observedRunningTime="2026-03-17 11:19:37.574838748 +0000 UTC m=+480.700966516" watchObservedRunningTime="2026-03-17 11:19:37.578924776 +0000 UTC m=+480.705052534" Mar 17 11:19:38 crc kubenswrapper[4742]: I0317 11:19:38.572069 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5nq4d" event={"ID":"ebd9754c-6bff-490f-a8c5-5aa16bb9170e","Type":"ContainerStarted","Data":"762986bb7a794368a8a410a776cb197359983edfb3485f31d5f5e4ffd9fa52a7"} Mar 17 11:19:38 crc kubenswrapper[4742]: I0317 11:19:38.573616 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p52h7" event={"ID":"a52e996e-9305-4a8a-bb51-9d2d72223dcf","Type":"ContainerStarted","Data":"c6aabaf8b6c4c8c65a34c496d74c40cf04158398bb78638a263ce9a1f19f5816"} Mar 17 11:19:38 crc kubenswrapper[4742]: I0317 11:19:38.575668 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnvhp" event={"ID":"5b587550-1bc7-4b07-a980-39f876baec8f","Type":"ContainerStarted","Data":"f2754896cf30f6ae224faf876b8ee131ba6ecb139c1f9bc3967fc8b1744f879e"} Mar 17 11:19:38 crc kubenswrapper[4742]: I0317 11:19:38.591969 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5nq4d" podStartSLOduration=2.18528701 podStartE2EDuration="4.59195467s" podCreationTimestamp="2026-03-17 11:19:34 +0000 UTC" firstStartedPulling="2026-03-17 11:19:35.543020785 +0000 UTC m=+478.669148583" lastFinishedPulling="2026-03-17 11:19:37.949688485 +0000 UTC m=+481.075816243" observedRunningTime="2026-03-17 11:19:38.589199501 +0000 UTC m=+481.715327279" watchObservedRunningTime="2026-03-17 11:19:38.59195467 +0000 UTC m=+481.718082428" Mar 17 11:19:39 crc kubenswrapper[4742]: I0317 11:19:39.587645 4742 generic.go:334] "Generic (PLEG): container finished" podID="a52e996e-9305-4a8a-bb51-9d2d72223dcf" containerID="c6aabaf8b6c4c8c65a34c496d74c40cf04158398bb78638a263ce9a1f19f5816" exitCode=0 Mar 17 11:19:39 crc kubenswrapper[4742]: I0317 11:19:39.587751 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p52h7" event={"ID":"a52e996e-9305-4a8a-bb51-9d2d72223dcf","Type":"ContainerDied","Data":"c6aabaf8b6c4c8c65a34c496d74c40cf04158398bb78638a263ce9a1f19f5816"} Mar 17 11:19:39 crc 
kubenswrapper[4742]: I0317 11:19:39.605690 4742 generic.go:334] "Generic (PLEG): container finished" podID="5b587550-1bc7-4b07-a980-39f876baec8f" containerID="f2754896cf30f6ae224faf876b8ee131ba6ecb139c1f9bc3967fc8b1744f879e" exitCode=0
Mar 17 11:19:39 crc kubenswrapper[4742]: I0317 11:19:39.605787 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnvhp" event={"ID":"5b587550-1bc7-4b07-a980-39f876baec8f","Type":"ContainerDied","Data":"f2754896cf30f6ae224faf876b8ee131ba6ecb139c1f9bc3967fc8b1744f879e"}
Mar 17 11:19:40 crc kubenswrapper[4742]: I0317 11:19:40.615434 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p52h7" event={"ID":"a52e996e-9305-4a8a-bb51-9d2d72223dcf","Type":"ContainerStarted","Data":"4b75ee601018482e051623b5f21f20616250bd08c568777a7842b09df1c61585"}
Mar 17 11:19:40 crc kubenswrapper[4742]: I0317 11:19:40.619541 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnvhp" event={"ID":"5b587550-1bc7-4b07-a980-39f876baec8f","Type":"ContainerStarted","Data":"5f1fe5e0096af1d3d10bf091f7b3ab14912740705d0dd65f4cba83e381922946"}
Mar 17 11:19:40 crc kubenswrapper[4742]: I0317 11:19:40.680323 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p52h7" podStartSLOduration=2.202778022 podStartE2EDuration="4.680308449s" podCreationTimestamp="2026-03-17 11:19:36 +0000 UTC" firstStartedPulling="2026-03-17 11:19:37.561600968 +0000 UTC m=+480.687728726" lastFinishedPulling="2026-03-17 11:19:40.039131385 +0000 UTC m=+483.165259153" observedRunningTime="2026-03-17 11:19:40.653737435 +0000 UTC m=+483.779865203" watchObservedRunningTime="2026-03-17 11:19:40.680308449 +0000 UTC m=+483.806436207"
Mar 17 11:19:40 crc kubenswrapper[4742]: I0317 11:19:40.680600 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vnvhp" podStartSLOduration=2.175995461 podStartE2EDuration="4.680596767s" podCreationTimestamp="2026-03-17 11:19:36 +0000 UTC" firstStartedPulling="2026-03-17 11:19:37.562988308 +0000 UTC m=+480.689116066" lastFinishedPulling="2026-03-17 11:19:40.067589594 +0000 UTC m=+483.193717372" observedRunningTime="2026-03-17 11:19:40.676276363 +0000 UTC m=+483.802404161" watchObservedRunningTime="2026-03-17 11:19:40.680596767 +0000 UTC m=+483.806724525"
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.312988 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" podUID="6b38516a-3938-421e-9191-03786c23318c" containerName="registry" containerID="cri-o://0e55ea87007e27cbabd32b962fba400b272f4d7478a340f5715c6574942a8890" gracePeriod=30
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.642949 4742 generic.go:334] "Generic (PLEG): container finished" podID="6b38516a-3938-421e-9191-03786c23318c" containerID="0e55ea87007e27cbabd32b962fba400b272f4d7478a340f5715c6574942a8890" exitCode=0
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.643007 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" event={"ID":"6b38516a-3938-421e-9191-03786c23318c","Type":"ContainerDied","Data":"0e55ea87007e27cbabd32b962fba400b272f4d7478a340f5715c6574942a8890"}
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.748880 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.882453 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"6b38516a-3938-421e-9191-03786c23318c\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") "
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.882517 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kll7c\" (UniqueName: \"kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-kube-api-access-kll7c\") pod \"6b38516a-3938-421e-9191-03786c23318c\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") "
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.882552 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b38516a-3938-421e-9191-03786c23318c-registry-certificates\") pod \"6b38516a-3938-421e-9191-03786c23318c\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") "
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.882571 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b38516a-3938-421e-9191-03786c23318c-ca-trust-extracted\") pod \"6b38516a-3938-421e-9191-03786c23318c\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") "
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.882606 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-registry-tls\") pod \"6b38516a-3938-421e-9191-03786c23318c\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") "
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.882632 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b38516a-3938-421e-9191-03786c23318c-installation-pull-secrets\") pod \"6b38516a-3938-421e-9191-03786c23318c\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") "
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.882647 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b38516a-3938-421e-9191-03786c23318c-trusted-ca\") pod \"6b38516a-3938-421e-9191-03786c23318c\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") "
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.882666 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-bound-sa-token\") pod \"6b38516a-3938-421e-9191-03786c23318c\" (UID: \"6b38516a-3938-421e-9191-03786c23318c\") "
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.884892 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b38516a-3938-421e-9191-03786c23318c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "6b38516a-3938-421e-9191-03786c23318c" (UID: "6b38516a-3938-421e-9191-03786c23318c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.885002 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b38516a-3938-421e-9191-03786c23318c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6b38516a-3938-421e-9191-03786c23318c" (UID: "6b38516a-3938-421e-9191-03786c23318c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.891449 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-kube-api-access-kll7c" (OuterVolumeSpecName: "kube-api-access-kll7c") pod "6b38516a-3938-421e-9191-03786c23318c" (UID: "6b38516a-3938-421e-9191-03786c23318c"). InnerVolumeSpecName "kube-api-access-kll7c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.905064 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6b38516a-3938-421e-9191-03786c23318c" (UID: "6b38516a-3938-421e-9191-03786c23318c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.905539 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6b38516a-3938-421e-9191-03786c23318c" (UID: "6b38516a-3938-421e-9191-03786c23318c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.909671 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b38516a-3938-421e-9191-03786c23318c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6b38516a-3938-421e-9191-03786c23318c" (UID: "6b38516a-3938-421e-9191-03786c23318c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.911401 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b38516a-3938-421e-9191-03786c23318c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6b38516a-3938-421e-9191-03786c23318c" (UID: "6b38516a-3938-421e-9191-03786c23318c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.916989 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "6b38516a-3938-421e-9191-03786c23318c" (UID: "6b38516a-3938-421e-9191-03786c23318c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.984257 4742 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b38516a-3938-421e-9191-03786c23318c-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.984571 4742 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.984584 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kll7c\" (UniqueName: \"kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-kube-api-access-kll7c\") on node \"crc\" DevicePath \"\""
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.984596 4742 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b38516a-3938-421e-9191-03786c23318c-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.984607 4742 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b38516a-3938-421e-9191-03786c23318c-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.984620 4742 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b38516a-3938-421e-9191-03786c23318c-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 17 11:19:43 crc kubenswrapper[4742]: I0317 11:19:43.984632 4742 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b38516a-3938-421e-9191-03786c23318c-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 17 11:19:44 crc kubenswrapper[4742]: I0317 11:19:44.152360 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rhqsm"
Mar 17 11:19:44 crc kubenswrapper[4742]: I0317 11:19:44.152409 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rhqsm"
Mar 17 11:19:44 crc kubenswrapper[4742]: I0317 11:19:44.225475 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rhqsm"
Mar 17 11:19:44 crc kubenswrapper[4742]: I0317 11:19:44.356510 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5nq4d"
Mar 17 11:19:44 crc kubenswrapper[4742]: I0317 11:19:44.356579 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5nq4d"
Mar 17 11:19:44 crc kubenswrapper[4742]: I0317 11:19:44.410173 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5nq4d"
Mar 17 11:19:44 crc kubenswrapper[4742]: I0317 11:19:44.654800 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n"
Mar 17 11:19:44 crc kubenswrapper[4742]: I0317 11:19:44.654832 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-9lz9n" event={"ID":"6b38516a-3938-421e-9191-03786c23318c","Type":"ContainerDied","Data":"8dc68984417269e3c1b938664797e8757d61c155cf838069ac66a2e6a5dc48bf"}
Mar 17 11:19:44 crc kubenswrapper[4742]: I0317 11:19:44.654977 4742 scope.go:117] "RemoveContainer" containerID="0e55ea87007e27cbabd32b962fba400b272f4d7478a340f5715c6574942a8890"
Mar 17 11:19:44 crc kubenswrapper[4742]: I0317 11:19:44.696007 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9lz9n"]
Mar 17 11:19:44 crc kubenswrapper[4742]: I0317 11:19:44.696039 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-9lz9n"]
Mar 17 11:19:44 crc kubenswrapper[4742]: I0317 11:19:44.727045 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rhqsm"
Mar 17 11:19:44 crc kubenswrapper[4742]: I0317 11:19:44.740697 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5nq4d"
Mar 17 11:19:46 crc kubenswrapper[4742]: I0317 11:19:46.591025 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p52h7"
Mar 17 11:19:46 crc kubenswrapper[4742]: I0317 11:19:46.591277 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p52h7"
Mar 17 11:19:46 crc kubenswrapper[4742]: I0317 11:19:46.671437 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b38516a-3938-421e-9191-03786c23318c" path="/var/lib/kubelet/pods/6b38516a-3938-421e-9191-03786c23318c/volumes"
Mar 17 11:19:46 crc kubenswrapper[4742]: I0317 11:19:46.758490 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vnvhp"
Mar 17 11:19:46 crc kubenswrapper[4742]: I0317 11:19:46.758541 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vnvhp"
Mar 17 11:19:46 crc kubenswrapper[4742]: I0317 11:19:46.808523 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vnvhp"
Mar 17 11:19:47 crc kubenswrapper[4742]: I0317 11:19:47.637386 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p52h7" podUID="a52e996e-9305-4a8a-bb51-9d2d72223dcf" containerName="registry-server" probeResult="failure" output=<
Mar 17 11:19:47 crc kubenswrapper[4742]: timeout: failed to connect service ":50051" within 1s
Mar 17 11:19:47 crc kubenswrapper[4742]: >
Mar 17 11:19:47 crc kubenswrapper[4742]: I0317 11:19:47.712976 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vnvhp"
Mar 17 11:19:48 crc kubenswrapper[4742]: I0317 11:19:48.044398 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 11:19:48 crc kubenswrapper[4742]: I0317 11:19:48.044490 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 11:19:48 crc kubenswrapper[4742]: I0317 11:19:48.044550 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw"
Mar 17 11:19:48 crc kubenswrapper[4742]: I0317 11:19:48.045323 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d359fe986da000baf4416f14e6a6add8b7b7042aba869fb27193d50f2884a38b"} pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 17 11:19:48 crc kubenswrapper[4742]: I0317 11:19:48.045428 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" containerID="cri-o://d359fe986da000baf4416f14e6a6add8b7b7042aba869fb27193d50f2884a38b" gracePeriod=600
Mar 17 11:19:48 crc kubenswrapper[4742]: I0317 11:19:48.680205 4742 generic.go:334] "Generic (PLEG): container finished" podID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerID="d359fe986da000baf4416f14e6a6add8b7b7042aba869fb27193d50f2884a38b" exitCode=0
Mar 17 11:19:48 crc kubenswrapper[4742]: I0317 11:19:48.680311 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerDied","Data":"d359fe986da000baf4416f14e6a6add8b7b7042aba869fb27193d50f2884a38b"}
Mar 17 11:19:48 crc kubenswrapper[4742]: I0317 11:19:48.680787 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"c79ebe7c8568e968315882c89bb4596b0d3698f3b81bf528102fe1275360f30f"}
Mar 17 11:19:48 crc kubenswrapper[4742]: I0317 11:19:48.680816 4742 scope.go:117] "RemoveContainer" containerID="4f44d1a8389879ee7405ceeacc13893813282f4efbb8c0200475a845aacee092"
Mar 17 11:19:56 crc kubenswrapper[4742]: I0317 11:19:56.673320 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p52h7"
Mar 17 11:19:56 crc kubenswrapper[4742]: I0317 11:19:56.730444 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p52h7"
Mar 17 11:20:00 crc kubenswrapper[4742]: I0317 11:20:00.129267 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562440-nvhsf"]
Mar 17 11:20:00 crc kubenswrapper[4742]: E0317 11:20:00.130073 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b38516a-3938-421e-9191-03786c23318c" containerName="registry"
Mar 17 11:20:00 crc kubenswrapper[4742]: I0317 11:20:00.130090 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b38516a-3938-421e-9191-03786c23318c" containerName="registry"
Mar 17 11:20:00 crc kubenswrapper[4742]: I0317 11:20:00.130231 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b38516a-3938-421e-9191-03786c23318c" containerName="registry"
Mar 17 11:20:00 crc kubenswrapper[4742]: I0317 11:20:00.130659 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562440-nvhsf"
Mar 17 11:20:00 crc kubenswrapper[4742]: I0317 11:20:00.132717 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk"
Mar 17 11:20:00 crc kubenswrapper[4742]: I0317 11:20:00.133000 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 11:20:00 crc kubenswrapper[4742]: I0317 11:20:00.133168 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 11:20:00 crc kubenswrapper[4742]: I0317 11:20:00.134316 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562440-nvhsf"]
Mar 17 11:20:00 crc kubenswrapper[4742]: I0317 11:20:00.295460 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdblx\" (UniqueName: \"kubernetes.io/projected/b9cb0185-95a0-4694-8dcd-b75801842648-kube-api-access-rdblx\") pod \"auto-csr-approver-29562440-nvhsf\" (UID: \"b9cb0185-95a0-4694-8dcd-b75801842648\") " pod="openshift-infra/auto-csr-approver-29562440-nvhsf"
Mar 17 11:20:00 crc kubenswrapper[4742]: I0317 11:20:00.396974 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdblx\" (UniqueName: \"kubernetes.io/projected/b9cb0185-95a0-4694-8dcd-b75801842648-kube-api-access-rdblx\") pod \"auto-csr-approver-29562440-nvhsf\" (UID: \"b9cb0185-95a0-4694-8dcd-b75801842648\") " pod="openshift-infra/auto-csr-approver-29562440-nvhsf"
Mar 17 11:20:00 crc kubenswrapper[4742]: I0317 11:20:00.421606 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdblx\" (UniqueName: \"kubernetes.io/projected/b9cb0185-95a0-4694-8dcd-b75801842648-kube-api-access-rdblx\") pod \"auto-csr-approver-29562440-nvhsf\" (UID: \"b9cb0185-95a0-4694-8dcd-b75801842648\") " pod="openshift-infra/auto-csr-approver-29562440-nvhsf"
Mar 17 11:20:00 crc kubenswrapper[4742]: I0317 11:20:00.450845 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562440-nvhsf"
Mar 17 11:20:00 crc kubenswrapper[4742]: I0317 11:20:00.847011 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562440-nvhsf"]
Mar 17 11:20:00 crc kubenswrapper[4742]: W0317 11:20:00.858837 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9cb0185_95a0_4694_8dcd_b75801842648.slice/crio-f4f6414d4af19097ec21dee52e2cdcf1bbb5d6fc8a8eb10ee57cf653e328298c WatchSource:0}: Error finding container f4f6414d4af19097ec21dee52e2cdcf1bbb5d6fc8a8eb10ee57cf653e328298c: Status 404 returned error can't find the container with id f4f6414d4af19097ec21dee52e2cdcf1bbb5d6fc8a8eb10ee57cf653e328298c
Mar 17 11:20:01 crc kubenswrapper[4742]: I0317 11:20:01.784421 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562440-nvhsf" event={"ID":"b9cb0185-95a0-4694-8dcd-b75801842648","Type":"ContainerStarted","Data":"f4f6414d4af19097ec21dee52e2cdcf1bbb5d6fc8a8eb10ee57cf653e328298c"}
Mar 17 11:20:02 crc kubenswrapper[4742]: I0317 11:20:02.790949 4742 generic.go:334] "Generic (PLEG): container finished" podID="b9cb0185-95a0-4694-8dcd-b75801842648" containerID="ed15874775926665a5f3c5e51e46899cef9e24dcdb55aa758011f2ed5e03a40f" exitCode=0
Mar 17 11:20:02 crc kubenswrapper[4742]: I0317 11:20:02.791220 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562440-nvhsf" event={"ID":"b9cb0185-95a0-4694-8dcd-b75801842648","Type":"ContainerDied","Data":"ed15874775926665a5f3c5e51e46899cef9e24dcdb55aa758011f2ed5e03a40f"}
Mar 17 11:20:04 crc kubenswrapper[4742]: I0317 11:20:04.103927 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562440-nvhsf"
Mar 17 11:20:04 crc kubenswrapper[4742]: I0317 11:20:04.286769 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdblx\" (UniqueName: \"kubernetes.io/projected/b9cb0185-95a0-4694-8dcd-b75801842648-kube-api-access-rdblx\") pod \"b9cb0185-95a0-4694-8dcd-b75801842648\" (UID: \"b9cb0185-95a0-4694-8dcd-b75801842648\") "
Mar 17 11:20:04 crc kubenswrapper[4742]: I0317 11:20:04.295086 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9cb0185-95a0-4694-8dcd-b75801842648-kube-api-access-rdblx" (OuterVolumeSpecName: "kube-api-access-rdblx") pod "b9cb0185-95a0-4694-8dcd-b75801842648" (UID: "b9cb0185-95a0-4694-8dcd-b75801842648"). InnerVolumeSpecName "kube-api-access-rdblx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:20:04 crc kubenswrapper[4742]: I0317 11:20:04.390833 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdblx\" (UniqueName: \"kubernetes.io/projected/b9cb0185-95a0-4694-8dcd-b75801842648-kube-api-access-rdblx\") on node \"crc\" DevicePath \"\""
Mar 17 11:20:04 crc kubenswrapper[4742]: I0317 11:20:04.803382 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562440-nvhsf" event={"ID":"b9cb0185-95a0-4694-8dcd-b75801842648","Type":"ContainerDied","Data":"f4f6414d4af19097ec21dee52e2cdcf1bbb5d6fc8a8eb10ee57cf653e328298c"}
Mar 17 11:20:04 crc kubenswrapper[4742]: I0317 11:20:04.803769 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4f6414d4af19097ec21dee52e2cdcf1bbb5d6fc8a8eb10ee57cf653e328298c"
Mar 17 11:20:04 crc kubenswrapper[4742]: I0317 11:20:04.803513 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562440-nvhsf"
Mar 17 11:20:05 crc kubenswrapper[4742]: I0317 11:20:05.158644 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562434-wtx87"]
Mar 17 11:20:05 crc kubenswrapper[4742]: I0317 11:20:05.162596 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562434-wtx87"]
Mar 17 11:20:06 crc kubenswrapper[4742]: I0317 11:20:06.669734 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b3a8612-a5db-4ec8-9873-32829e2fe69e" path="/var/lib/kubelet/pods/5b3a8612-a5db-4ec8-9873-32829e2fe69e/volumes"
Mar 17 11:20:10 crc kubenswrapper[4742]: I0317 11:20:10.878825 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 17 11:21:48 crc kubenswrapper[4742]: I0317 11:21:48.044260 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 11:21:48 crc kubenswrapper[4742]: I0317 11:21:48.045659 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 11:22:00 crc kubenswrapper[4742]: I0317 11:22:00.153357 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562442-cnzcc"]
Mar 17 11:22:00 crc kubenswrapper[4742]: E0317 11:22:00.156063 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9cb0185-95a0-4694-8dcd-b75801842648" containerName="oc"
Mar 17 11:22:00 crc kubenswrapper[4742]: I0317 11:22:00.156091 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9cb0185-95a0-4694-8dcd-b75801842648" containerName="oc"
Mar 17 11:22:00 crc kubenswrapper[4742]: I0317 11:22:00.156289 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9cb0185-95a0-4694-8dcd-b75801842648" containerName="oc"
Mar 17 11:22:00 crc kubenswrapper[4742]: I0317 11:22:00.156949 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562442-cnzcc"
Mar 17 11:22:00 crc kubenswrapper[4742]: I0317 11:22:00.160661 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 11:22:00 crc kubenswrapper[4742]: I0317 11:22:00.160678 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 11:22:00 crc kubenswrapper[4742]: I0317 11:22:00.160678 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk"
Mar 17 11:22:00 crc kubenswrapper[4742]: I0317 11:22:00.169895 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562442-cnzcc"]
Mar 17 11:22:00 crc kubenswrapper[4742]: I0317 11:22:00.257993 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2ffj\" (UniqueName: \"kubernetes.io/projected/d9ad19b9-d849-4cb6-9ac9-3a35f9de9927-kube-api-access-n2ffj\") pod \"auto-csr-approver-29562442-cnzcc\" (UID: \"d9ad19b9-d849-4cb6-9ac9-3a35f9de9927\") " pod="openshift-infra/auto-csr-approver-29562442-cnzcc"
Mar 17 11:22:00 crc kubenswrapper[4742]: I0317 11:22:00.359319 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2ffj\" (UniqueName: \"kubernetes.io/projected/d9ad19b9-d849-4cb6-9ac9-3a35f9de9927-kube-api-access-n2ffj\") pod \"auto-csr-approver-29562442-cnzcc\" (UID: \"d9ad19b9-d849-4cb6-9ac9-3a35f9de9927\") " pod="openshift-infra/auto-csr-approver-29562442-cnzcc"
Mar 17 11:22:00 crc kubenswrapper[4742]: I0317 11:22:00.384734 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2ffj\" (UniqueName: \"kubernetes.io/projected/d9ad19b9-d849-4cb6-9ac9-3a35f9de9927-kube-api-access-n2ffj\") pod \"auto-csr-approver-29562442-cnzcc\" (UID: \"d9ad19b9-d849-4cb6-9ac9-3a35f9de9927\") " pod="openshift-infra/auto-csr-approver-29562442-cnzcc"
Mar 17 11:22:00 crc kubenswrapper[4742]: I0317 11:22:00.494008 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562442-cnzcc"
Mar 17 11:22:00 crc kubenswrapper[4742]: I0317 11:22:00.707457 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562442-cnzcc"]
Mar 17 11:22:00 crc kubenswrapper[4742]: I0317 11:22:00.721597 4742 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 17 11:22:01 crc kubenswrapper[4742]: I0317 11:22:01.614474 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562442-cnzcc" event={"ID":"d9ad19b9-d849-4cb6-9ac9-3a35f9de9927","Type":"ContainerStarted","Data":"ab34439628301702f1e4aff54dfcf67f39c1ee5e8eea757e490f75c88c58162b"}
Mar 17 11:22:02 crc kubenswrapper[4742]: I0317 11:22:02.624747 4742 generic.go:334] "Generic (PLEG): container finished" podID="d9ad19b9-d849-4cb6-9ac9-3a35f9de9927" containerID="b5fdcc36049c8777522ecab712ae2e1e1abacf4d5942a333133dec78d5882702" exitCode=0
Mar 17 11:22:02 crc kubenswrapper[4742]: I0317 11:22:02.624826 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562442-cnzcc" event={"ID":"d9ad19b9-d849-4cb6-9ac9-3a35f9de9927","Type":"ContainerDied","Data":"b5fdcc36049c8777522ecab712ae2e1e1abacf4d5942a333133dec78d5882702"}
Mar 17 11:22:03 crc kubenswrapper[4742]: I0317 11:22:03.879785 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562442-cnzcc"
Mar 17 11:22:04 crc kubenswrapper[4742]: I0317 11:22:04.027845 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2ffj\" (UniqueName: \"kubernetes.io/projected/d9ad19b9-d849-4cb6-9ac9-3a35f9de9927-kube-api-access-n2ffj\") pod \"d9ad19b9-d849-4cb6-9ac9-3a35f9de9927\" (UID: \"d9ad19b9-d849-4cb6-9ac9-3a35f9de9927\") "
Mar 17 11:22:04 crc kubenswrapper[4742]: I0317 11:22:04.034554 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9ad19b9-d849-4cb6-9ac9-3a35f9de9927-kube-api-access-n2ffj" (OuterVolumeSpecName: "kube-api-access-n2ffj") pod "d9ad19b9-d849-4cb6-9ac9-3a35f9de9927" (UID: "d9ad19b9-d849-4cb6-9ac9-3a35f9de9927"). InnerVolumeSpecName "kube-api-access-n2ffj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:22:04 crc kubenswrapper[4742]: I0317 11:22:04.130507 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2ffj\" (UniqueName: \"kubernetes.io/projected/d9ad19b9-d849-4cb6-9ac9-3a35f9de9927-kube-api-access-n2ffj\") on node \"crc\" DevicePath \"\""
Mar 17 11:22:04 crc kubenswrapper[4742]: I0317 11:22:04.639273 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562442-cnzcc" event={"ID":"d9ad19b9-d849-4cb6-9ac9-3a35f9de9927","Type":"ContainerDied","Data":"ab34439628301702f1e4aff54dfcf67f39c1ee5e8eea757e490f75c88c58162b"}
Mar 17 11:22:04 crc kubenswrapper[4742]: I0317 11:22:04.639346 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab34439628301702f1e4aff54dfcf67f39c1ee5e8eea757e490f75c88c58162b"
Mar 17 11:22:04 crc kubenswrapper[4742]: I0317 11:22:04.639373 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562442-cnzcc"
Mar 17 11:22:04 crc kubenswrapper[4742]: I0317 11:22:04.935128 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562436-cnnrt"]
Mar 17 11:22:04 crc kubenswrapper[4742]: I0317 11:22:04.940550 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562436-cnnrt"]
Mar 17 11:22:06 crc kubenswrapper[4742]: I0317 11:22:06.675199 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f99ba73f-1688-43ea-9538-bc7623c02521" path="/var/lib/kubelet/pods/f99ba73f-1688-43ea-9538-bc7623c02521/volumes"
Mar 17 11:22:18 crc kubenswrapper[4742]: I0317 11:22:18.044514 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 11:22:18 crc kubenswrapper[4742]: I0317 11:22:18.045284 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 11:22:39 crc kubenswrapper[4742]: I0317 11:22:39.683984 4742 scope.go:117] "RemoveContainer" containerID="b43ac5c4c08e2b62cdaee4086a7c989df52d9423ccb4111fa8cb8bb2701e5648"
Mar 17 11:22:39 crc kubenswrapper[4742]: I0317 11:22:39.735764 4742 scope.go:117] "RemoveContainer" containerID="3c8562a01c8b5d058ff3e1345ed1ecb4fae67d6e90929d34e54a3088d4ae1d5c"
Mar 17 11:22:48 crc kubenswrapper[4742]: I0317 11:22:48.044213 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 11:22:48 crc kubenswrapper[4742]: I0317 11:22:48.044966 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 11:22:48 crc kubenswrapper[4742]: I0317 11:22:48.045040 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw"
Mar 17 11:22:48 crc kubenswrapper[4742]: I0317 11:22:48.045892 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c79ebe7c8568e968315882c89bb4596b0d3698f3b81bf528102fe1275360f30f"} pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 17 11:22:48 crc kubenswrapper[4742]: I0317 11:22:48.046027 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" containerID="cri-o://c79ebe7c8568e968315882c89bb4596b0d3698f3b81bf528102fe1275360f30f" gracePeriod=600
Mar 17 11:22:48 crc kubenswrapper[4742]: I0317 11:22:48.972809 4742 generic.go:334] "Generic (PLEG): container finished" podID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerID="c79ebe7c8568e968315882c89bb4596b0d3698f3b81bf528102fe1275360f30f" exitCode=0
Mar 17 11:22:48 crc kubenswrapper[4742]: I0317 11:22:48.972957 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerDied","Data":"c79ebe7c8568e968315882c89bb4596b0d3698f3b81bf528102fe1275360f30f"}
Mar 17 11:22:48 crc kubenswrapper[4742]: I0317 11:22:48.973786 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"0b6d37342f3ee85fc8b1ee717e3e6b6ff2837c9e6e923cd75738d5afd1b0bd6d"}
Mar 17 11:22:48 crc kubenswrapper[4742]: I0317 11:22:48.973827 4742 scope.go:117] "RemoveContainer" containerID="d359fe986da000baf4416f14e6a6add8b7b7042aba869fb27193d50f2884a38b"
Mar 17 11:24:00 crc kubenswrapper[4742]: I0317 11:24:00.145886 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562444-9mtj5"]
Mar 17 11:24:00 crc kubenswrapper[4742]: E0317 11:24:00.147467 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ad19b9-d849-4cb6-9ac9-3a35f9de9927" containerName="oc"
Mar 17 11:24:00 crc kubenswrapper[4742]: I0317 11:24:00.147500 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ad19b9-d849-4cb6-9ac9-3a35f9de9927" containerName="oc"
Mar 17 11:24:00 crc kubenswrapper[4742]: I0317 11:24:00.147713 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9ad19b9-d849-4cb6-9ac9-3a35f9de9927" containerName="oc"
Mar 17 11:24:00 crc kubenswrapper[4742]: I0317 11:24:00.148629 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562444-9mtj5"
Mar 17 11:24:00 crc kubenswrapper[4742]: I0317 11:24:00.151775 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 11:24:00 crc kubenswrapper[4742]: I0317 11:24:00.152419 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 11:24:00 crc kubenswrapper[4742]: I0317 11:24:00.153285 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk"
Mar 17 11:24:00 crc kubenswrapper[4742]: I0317 11:24:00.156007 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562444-9mtj5"]
Mar 17 11:24:00 crc kubenswrapper[4742]: I0317 11:24:00.276798 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjlnw\" (UniqueName: \"kubernetes.io/projected/520a6146-8525-4d61-bbc0-8fe2c576b266-kube-api-access-cjlnw\") pod \"auto-csr-approver-29562444-9mtj5\" (UID: \"520a6146-8525-4d61-bbc0-8fe2c576b266\") " pod="openshift-infra/auto-csr-approver-29562444-9mtj5"
Mar 17 11:24:00 crc kubenswrapper[4742]: I0317 11:24:00.378539 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjlnw\" (UniqueName: \"kubernetes.io/projected/520a6146-8525-4d61-bbc0-8fe2c576b266-kube-api-access-cjlnw\") pod \"auto-csr-approver-29562444-9mtj5\" (UID: \"520a6146-8525-4d61-bbc0-8fe2c576b266\") " pod="openshift-infra/auto-csr-approver-29562444-9mtj5"
Mar 17 11:24:00 crc kubenswrapper[4742]: I0317 11:24:00.411017 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjlnw\" (UniqueName: \"kubernetes.io/projected/520a6146-8525-4d61-bbc0-8fe2c576b266-kube-api-access-cjlnw\") pod \"auto-csr-approver-29562444-9mtj5\" (UID: \"520a6146-8525-4d61-bbc0-8fe2c576b266\") " pod="openshift-infra/auto-csr-approver-29562444-9mtj5"
Mar 17 11:24:00 crc kubenswrapper[4742]: I0317 11:24:00.477219 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562444-9mtj5"
Mar 17 11:24:00 crc kubenswrapper[4742]: I0317 11:24:00.781185 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562444-9mtj5"]
Mar 17 11:24:01 crc kubenswrapper[4742]: I0317 11:24:01.490532 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562444-9mtj5" event={"ID":"520a6146-8525-4d61-bbc0-8fe2c576b266","Type":"ContainerStarted","Data":"6105269fbb248a177224ca0f53ef73ef65ec38c56d8744b055831897a4d3eaad"}
Mar 17 11:24:03 crc kubenswrapper[4742]: I0317 11:24:03.507488 4742 generic.go:334] "Generic (PLEG): container finished" podID="520a6146-8525-4d61-bbc0-8fe2c576b266" containerID="6b21e189edc79bc7724b4a120234d0cc9adf3120f669a465905ed388680b1afe" exitCode=0
Mar 17 11:24:03 crc kubenswrapper[4742]: I0317 11:24:03.507842 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562444-9mtj5" event={"ID":"520a6146-8525-4d61-bbc0-8fe2c576b266","Type":"ContainerDied","Data":"6b21e189edc79bc7724b4a120234d0cc9adf3120f669a465905ed388680b1afe"}
Mar 17 11:24:04 crc kubenswrapper[4742]: I0317 11:24:04.808077 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562444-9mtj5"
Mar 17 11:24:04 crc kubenswrapper[4742]: I0317 11:24:04.848172 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjlnw\" (UniqueName: \"kubernetes.io/projected/520a6146-8525-4d61-bbc0-8fe2c576b266-kube-api-access-cjlnw\") pod \"520a6146-8525-4d61-bbc0-8fe2c576b266\" (UID: \"520a6146-8525-4d61-bbc0-8fe2c576b266\") "
Mar 17 11:24:04 crc kubenswrapper[4742]: I0317 11:24:04.858201 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/520a6146-8525-4d61-bbc0-8fe2c576b266-kube-api-access-cjlnw" (OuterVolumeSpecName: "kube-api-access-cjlnw") pod "520a6146-8525-4d61-bbc0-8fe2c576b266" (UID: "520a6146-8525-4d61-bbc0-8fe2c576b266"). InnerVolumeSpecName "kube-api-access-cjlnw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:24:04 crc kubenswrapper[4742]: I0317 11:24:04.950222 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjlnw\" (UniqueName: \"kubernetes.io/projected/520a6146-8525-4d61-bbc0-8fe2c576b266-kube-api-access-cjlnw\") on node \"crc\" DevicePath \"\""
Mar 17 11:24:05 crc kubenswrapper[4742]: I0317 11:24:05.520875 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562444-9mtj5" event={"ID":"520a6146-8525-4d61-bbc0-8fe2c576b266","Type":"ContainerDied","Data":"6105269fbb248a177224ca0f53ef73ef65ec38c56d8744b055831897a4d3eaad"}
Mar 17 11:24:05 crc kubenswrapper[4742]: I0317 11:24:05.520926 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6105269fbb248a177224ca0f53ef73ef65ec38c56d8744b055831897a4d3eaad"
Mar 17 11:24:05 crc kubenswrapper[4742]: I0317 11:24:05.520983 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562444-9mtj5"
Mar 17 11:24:05 crc kubenswrapper[4742]: I0317 11:24:05.883648 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562438-9522d"]
Mar 17 11:24:05 crc kubenswrapper[4742]: I0317 11:24:05.890536 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562438-9522d"]
Mar 17 11:24:06 crc kubenswrapper[4742]: I0317 11:24:06.670151 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e08745d1-ba86-4a70-a6f0-c8108edb08b7" path="/var/lib/kubelet/pods/e08745d1-ba86-4a70-a6f0-c8108edb08b7/volumes"
Mar 17 11:24:39 crc kubenswrapper[4742]: I0317 11:24:39.824262 4742 scope.go:117] "RemoveContainer" containerID="608de52b8994835e4794de9795c9480d78922f9e81dfca4b2b4d7d4393551adc"
Mar 17 11:24:48 crc kubenswrapper[4742]: I0317 11:24:48.044421 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 11:24:48 crc kubenswrapper[4742]: I0317 11:24:48.045218 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.275493 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-k4cwb"]
Mar 17 11:25:03 crc kubenswrapper[4742]: E0317 11:25:03.276213 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520a6146-8525-4d61-bbc0-8fe2c576b266" containerName="oc"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.276229 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="520a6146-8525-4d61-bbc0-8fe2c576b266" containerName="oc"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.276352 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="520a6146-8525-4d61-bbc0-8fe2c576b266" containerName="oc"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.276770 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-k4cwb"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.279566 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.279881 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.281129 4742 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-sspzm"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.297641 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-ncl69"]
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.298472 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-ncl69"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.302726 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-k4cwb"]
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.304408 4742 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-qq9jp"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.317405 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-ncl69"]
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.325421 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vf65m"]
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.326024 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-vf65m"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.327584 4742 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-5gkv7"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.346397 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vf65m"]
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.346867 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drkzg\" (UniqueName: \"kubernetes.io/projected/09203846-9e2d-4748-b11f-c64b5a9c9c85-kube-api-access-drkzg\") pod \"cert-manager-webhook-687f57d79b-vf65m\" (UID: \"09203846-9e2d-4748-b11f-c64b5a9c9c85\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vf65m"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.346956 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jwt5\" (UniqueName: \"kubernetes.io/projected/fb8bea11-37f9-43cf-9a3c-07e54ebca5fa-kube-api-access-5jwt5\") pod \"cert-manager-858654f9db-ncl69\" (UID: \"fb8bea11-37f9-43cf-9a3c-07e54ebca5fa\") " pod="cert-manager/cert-manager-858654f9db-ncl69"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.347025 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrgdb\" (UniqueName: \"kubernetes.io/projected/a8125ed7-e435-4a7e-8b09-541af1b40820-kube-api-access-jrgdb\") pod \"cert-manager-cainjector-cf98fcc89-k4cwb\" (UID: \"a8125ed7-e435-4a7e-8b09-541af1b40820\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-k4cwb"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.447684 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drkzg\" (UniqueName: \"kubernetes.io/projected/09203846-9e2d-4748-b11f-c64b5a9c9c85-kube-api-access-drkzg\") pod \"cert-manager-webhook-687f57d79b-vf65m\" (UID: \"09203846-9e2d-4748-b11f-c64b5a9c9c85\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vf65m"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.447766 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jwt5\" (UniqueName: \"kubernetes.io/projected/fb8bea11-37f9-43cf-9a3c-07e54ebca5fa-kube-api-access-5jwt5\") pod \"cert-manager-858654f9db-ncl69\" (UID: \"fb8bea11-37f9-43cf-9a3c-07e54ebca5fa\") " pod="cert-manager/cert-manager-858654f9db-ncl69"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.447791 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrgdb\" (UniqueName: \"kubernetes.io/projected/a8125ed7-e435-4a7e-8b09-541af1b40820-kube-api-access-jrgdb\") pod \"cert-manager-cainjector-cf98fcc89-k4cwb\" (UID: \"a8125ed7-e435-4a7e-8b09-541af1b40820\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-k4cwb"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.464316 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drkzg\" (UniqueName: \"kubernetes.io/projected/09203846-9e2d-4748-b11f-c64b5a9c9c85-kube-api-access-drkzg\") pod \"cert-manager-webhook-687f57d79b-vf65m\" (UID: \"09203846-9e2d-4748-b11f-c64b5a9c9c85\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vf65m"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.465799 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jwt5\" (UniqueName: \"kubernetes.io/projected/fb8bea11-37f9-43cf-9a3c-07e54ebca5fa-kube-api-access-5jwt5\") pod \"cert-manager-858654f9db-ncl69\" (UID: \"fb8bea11-37f9-43cf-9a3c-07e54ebca5fa\") " pod="cert-manager/cert-manager-858654f9db-ncl69"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.466616 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrgdb\" (UniqueName: \"kubernetes.io/projected/a8125ed7-e435-4a7e-8b09-541af1b40820-kube-api-access-jrgdb\") pod \"cert-manager-cainjector-cf98fcc89-k4cwb\" (UID: \"a8125ed7-e435-4a7e-8b09-541af1b40820\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-k4cwb"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.594007 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-k4cwb"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.615443 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-ncl69"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.638756 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-vf65m"
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.863206 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-k4cwb"]
Mar 17 11:25:03 crc kubenswrapper[4742]: I0317 11:25:03.955945 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-k4cwb" event={"ID":"a8125ed7-e435-4a7e-8b09-541af1b40820","Type":"ContainerStarted","Data":"79140e385f5caf965055fabdb201fc6c2e7511ded8725e72c17da6e32ffaad59"}
Mar 17 11:25:04 crc kubenswrapper[4742]: I0317 11:25:04.124884 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vf65m"]
Mar 17 11:25:04 crc kubenswrapper[4742]: W0317 11:25:04.129205 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09203846_9e2d_4748_b11f_c64b5a9c9c85.slice/crio-f0553067dea49a4cdc2c70c3ff6e3cdb90c9e5e2c5b8b92a0dee7f1f1b45d89e WatchSource:0}: Error finding container f0553067dea49a4cdc2c70c3ff6e3cdb90c9e5e2c5b8b92a0dee7f1f1b45d89e: Status 404 returned error can't find the container with id f0553067dea49a4cdc2c70c3ff6e3cdb90c9e5e2c5b8b92a0dee7f1f1b45d89e
Mar 17 11:25:04 crc kubenswrapper[4742]: I0317 11:25:04.149176 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-ncl69"]
Mar 17 11:25:04 crc kubenswrapper[4742]: W0317 11:25:04.160338 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb8bea11_37f9_43cf_9a3c_07e54ebca5fa.slice/crio-e1c13dca1b8eac23241e19b9dc2a96cd3b95a1df2d498a009f272819ac1263dc WatchSource:0}: Error finding container e1c13dca1b8eac23241e19b9dc2a96cd3b95a1df2d498a009f272819ac1263dc: Status 404 returned error can't find the container with id e1c13dca1b8eac23241e19b9dc2a96cd3b95a1df2d498a009f272819ac1263dc
Mar 17 11:25:04 crc kubenswrapper[4742]: I0317 11:25:04.975600 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-ncl69" event={"ID":"fb8bea11-37f9-43cf-9a3c-07e54ebca5fa","Type":"ContainerStarted","Data":"e1c13dca1b8eac23241e19b9dc2a96cd3b95a1df2d498a009f272819ac1263dc"}
Mar 17 11:25:04 crc kubenswrapper[4742]: I0317 11:25:04.981835 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-vf65m" event={"ID":"09203846-9e2d-4748-b11f-c64b5a9c9c85","Type":"ContainerStarted","Data":"f0553067dea49a4cdc2c70c3ff6e3cdb90c9e5e2c5b8b92a0dee7f1f1b45d89e"}
Mar 17 11:25:06 crc kubenswrapper[4742]: I0317 11:25:06.993964 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-k4cwb" event={"ID":"a8125ed7-e435-4a7e-8b09-541af1b40820","Type":"ContainerStarted","Data":"f8b524ee8c595883506300526bdb1cf435372aaf660999e6e78d06fc53d31a7f"}
Mar 17 11:25:07 crc kubenswrapper[4742]: I0317 11:25:07.013500 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-k4cwb" podStartSLOduration=1.968531122 podStartE2EDuration="4.013452322s" podCreationTimestamp="2026-03-17 11:25:03 +0000 UTC" firstStartedPulling="2026-03-17 11:25:03.878355562 +0000 UTC m=+807.004483320" lastFinishedPulling="2026-03-17 11:25:05.923276732 +0000 UTC m=+809.049404520" observedRunningTime="2026-03-17 11:25:07.009390217 +0000 UTC m=+810.135518005" watchObservedRunningTime="2026-03-17 11:25:07.013452322 +0000 UTC m=+810.139580100"
Mar 17 11:25:08 crc kubenswrapper[4742]: I0317 11:25:08.000705 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-ncl69" event={"ID":"fb8bea11-37f9-43cf-9a3c-07e54ebca5fa","Type":"ContainerStarted","Data":"480eb33032f068352e19e43e0a8774a5462e144359b764ce55c85e3c35263b67"}
Mar 17 11:25:08 crc kubenswrapper[4742]: I0317 11:25:08.004973 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-vf65m" event={"ID":"09203846-9e2d-4748-b11f-c64b5a9c9c85","Type":"ContainerStarted","Data":"8a09e9b41972e78acc1eac72e3b2df05e350075625f5787f93ab9719c6cd1056"}
Mar 17 11:25:08 crc kubenswrapper[4742]: I0317 11:25:08.005288 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-vf65m"
Mar 17 11:25:08 crc kubenswrapper[4742]: I0317 11:25:08.019801 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-ncl69" podStartSLOduration=1.558555291 podStartE2EDuration="5.019756624s" podCreationTimestamp="2026-03-17 11:25:03 +0000 UTC" firstStartedPulling="2026-03-17 11:25:04.163126673 +0000 UTC m=+807.289254471" lastFinishedPulling="2026-03-17 11:25:07.624328026 +0000 UTC m=+810.750455804" observedRunningTime="2026-03-17 11:25:08.018126527 +0000 UTC m=+811.144254335" watchObservedRunningTime="2026-03-17 11:25:08.019756624 +0000 UTC m=+811.145884402"
Mar 17 11:25:08 crc kubenswrapper[4742]: I0317 11:25:08.049451 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-vf65m" podStartSLOduration=1.5633665570000002 podStartE2EDuration="5.049427705s" podCreationTimestamp="2026-03-17 11:25:03 +0000 UTC" firstStartedPulling="2026-03-17 11:25:04.131311031 +0000 UTC m=+807.257438789" lastFinishedPulling="2026-03-17 11:25:07.617372169 +0000 UTC m=+810.743499937" observedRunningTime="2026-03-17 11:25:08.041969413 +0000 UTC m=+811.168097181" watchObservedRunningTime="2026-03-17 11:25:08.049427705 +0000 UTC m=+811.175555503"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.288699 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zwfsr"]
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.289608 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovn-controller" containerID="cri-o://c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545" gracePeriod=30
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.289732 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="nbdb" containerID="cri-o://e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0" gracePeriod=30
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.289777 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397" gracePeriod=30
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.289838 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="kube-rbac-proxy-node" containerID="cri-o://0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33" gracePeriod=30
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.289896 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovn-acl-logging" containerID="cri-o://b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2" gracePeriod=30
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.289755 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="northd" containerID="cri-o://f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425" gracePeriod=30
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.290021 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="sbdb" containerID="cri-o://ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65" gracePeriod=30
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.343273 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovnkube-controller" containerID="cri-o://a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a" gracePeriod=30
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.643359 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-vf65m"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.643394 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovnkube-controller/3.log"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.647416 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovn-acl-logging/0.log"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.648031 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovn-controller/0.log"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.648482 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.740509 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5gwls"]
Mar 17 11:25:13 crc kubenswrapper[4742]: E0317 11:25:13.741110 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="kubecfg-setup"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741142 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="kubecfg-setup"
Mar 17 11:25:13 crc kubenswrapper[4742]: E0317 11:25:13.741167 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="kube-rbac-proxy-ovn-metrics"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741181 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="kube-rbac-proxy-ovn-metrics"
Mar 17 11:25:13 crc kubenswrapper[4742]: E0317 11:25:13.741198 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="nbdb"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741210 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="nbdb"
Mar 17 11:25:13 crc kubenswrapper[4742]: E0317 11:25:13.741227 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovn-controller"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741240 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovn-controller"
Mar 17 11:25:13 crc kubenswrapper[4742]: E0317 11:25:13.741261 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovnkube-controller"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741275 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovnkube-controller"
Mar 17 11:25:13 crc kubenswrapper[4742]: E0317 11:25:13.741295 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovnkube-controller"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741307 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovnkube-controller"
Mar 17 11:25:13 crc kubenswrapper[4742]: E0317 11:25:13.741328 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="sbdb"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741340 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="sbdb"
Mar 17 11:25:13 crc kubenswrapper[4742]: E0317 11:25:13.741359 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovnkube-controller"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741377 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovnkube-controller"
Mar 17 11:25:13 crc kubenswrapper[4742]: E0317 11:25:13.741393 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovn-acl-logging"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741405 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovn-acl-logging"
Mar 17 11:25:13 crc kubenswrapper[4742]: E0317 11:25:13.741432 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="northd"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741445 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="northd"
Mar 17 11:25:13 crc kubenswrapper[4742]: E0317 11:25:13.741469 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovnkube-controller"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741482 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovnkube-controller"
Mar 17 11:25:13 crc kubenswrapper[4742]: E0317 11:25:13.741500 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="kube-rbac-proxy-node"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741512 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="kube-rbac-proxy-node"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741843 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovnkube-controller"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741875 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovn-acl-logging"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741893 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovnkube-controller"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741940 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="kube-rbac-proxy-ovn-metrics"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741965 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovn-controller"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.741990 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="northd"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.742006 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovnkube-controller"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.742021 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="nbdb"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.742044 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="kube-rbac-proxy-node"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.742071 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="sbdb"
Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.742094 4742
memory_manager.go:354] "RemoveStaleState removing state" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovnkube-controller" Mar 17 11:25:13 crc kubenswrapper[4742]: E0317 11:25:13.742435 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovnkube-controller" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.742461 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovnkube-controller" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.742770 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerName="ovnkube-controller" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.749015 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.791460 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkjp8\" (UniqueName: \"kubernetes.io/projected/d021cdee-f700-4a5f-a62e-be4acbb8c62e-kube-api-access-qkjp8\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.791512 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-cni-netd\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.791538 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-etc-openvswitch\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.791561 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.791580 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-var-lib-openvswitch\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.791620 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-kubelet\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.791639 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-openvswitch\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.791690 
4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.791960 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.791990 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792008 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792025 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792066 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792195 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovnkube-script-lib\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792243 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-systemd-units\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792266 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-cni-bin\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792310 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-node-log\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792351 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-ovn\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792387 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-run-netns\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792413 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-systemd\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792436 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-slash\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792473 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-run-ovn-kubernetes\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792493 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovn-node-metrics-cert\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: 
\"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792509 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-log-socket\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792547 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovnkube-config\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792566 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-env-overrides\") pod \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\" (UID: \"d021cdee-f700-4a5f-a62e-be4acbb8c62e\") " Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792670 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-run-ovn\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792729 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792756 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-slash\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792805 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-node-log\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792835 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-systemd-units\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792876 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-var-lib-openvswitch\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 
11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792931 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-kubelet\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792960 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-run-ovn-kubernetes\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792985 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0978bd5c-49d8-4120-8052-69d29ecea82b-ovnkube-config\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793030 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0978bd5c-49d8-4120-8052-69d29ecea82b-ovn-node-metrics-cert\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793059 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-run-netns\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792670 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793107 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-cni-netd\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792701 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-node-log" (OuterVolumeSpecName: "node-log") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792723 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792741 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.792767 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-slash" (OuterVolumeSpecName: "host-slash") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793058 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793197 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793009 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793082 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-log-socket" (OuterVolumeSpecName: "log-socket") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793259 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793286 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0978bd5c-49d8-4120-8052-69d29ecea82b-ovnkube-script-lib\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793354 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-etc-openvswitch\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793354 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793438 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjk6k\" (UniqueName: \"kubernetes.io/projected/0978bd5c-49d8-4120-8052-69d29ecea82b-kube-api-access-cjk6k\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793461 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-run-openvswitch\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793508 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0978bd5c-49d8-4120-8052-69d29ecea82b-env-overrides\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793532 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-log-socket\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793558 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-cni-bin\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793608 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-run-systemd\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793686 4742 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-slash\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793706 4742 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793738 4742 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-log-socket\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793749 4742 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793759 4742 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793768 4742 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793778 4742 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793787 4742 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793797 4742 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793824 4742 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793833 4742 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793843 4742 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793853 
4742 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793862 4742 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793871 4742 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-node-log\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793894 4742 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.793918 4742 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.802436 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.805325 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d021cdee-f700-4a5f-a62e-be4acbb8c62e-kube-api-access-qkjp8" (OuterVolumeSpecName: "kube-api-access-qkjp8") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "kube-api-access-qkjp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.817373 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d021cdee-f700-4a5f-a62e-be4acbb8c62e" (UID: "d021cdee-f700-4a5f-a62e-be4acbb8c62e"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.895395 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-run-netns\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.895456 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-cni-netd\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.895488 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0978bd5c-49d8-4120-8052-69d29ecea82b-ovnkube-script-lib\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.895517 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-etc-openvswitch\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.895558 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjk6k\" (UniqueName: \"kubernetes.io/projected/0978bd5c-49d8-4120-8052-69d29ecea82b-kube-api-access-cjk6k\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.895560 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-run-netns\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.895627 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-cni-netd\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.895678 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-etc-openvswitch\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.895639 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-run-openvswitch\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 
11:25:13.895588 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-run-openvswitch\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.895785 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0978bd5c-49d8-4120-8052-69d29ecea82b-env-overrides\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.895830 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-log-socket\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.895873 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-cni-bin\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.895965 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-run-systemd\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896054 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-run-ovn\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896093 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896142 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-slash\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896188 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-node-log\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896236 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-systemd-units\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896264 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-var-lib-openvswitch\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896307 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-kubelet\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896351 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-run-ovn-kubernetes\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896388 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0978bd5c-49d8-4120-8052-69d29ecea82b-ovnkube-config\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896422 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0978bd5c-49d8-4120-8052-69d29ecea82b-ovn-node-metrics-cert\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896441 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0978bd5c-49d8-4120-8052-69d29ecea82b-env-overrides\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896512 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-kubelet\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896536 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-node-log\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896537 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896555 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-systemd-units\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896571 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-var-lib-openvswitch\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896596 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-log-socket\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896611 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-run-ovn-kubernetes\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896682 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-cni-bin\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896710 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-run-systemd\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896734 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0978bd5c-49d8-4120-8052-69d29ecea82b-ovnkube-script-lib\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896739 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-run-ovn\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896807 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0978bd5c-49d8-4120-8052-69d29ecea82b-host-slash\") pod \"ovnkube-node-5gwls\" (UID: 
\"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896899 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkjp8\" (UniqueName: \"kubernetes.io/projected/d021cdee-f700-4a5f-a62e-be4acbb8c62e-kube-api-access-qkjp8\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896948 4742 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d021cdee-f700-4a5f-a62e-be4acbb8c62e-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.896964 4742 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d021cdee-f700-4a5f-a62e-be4acbb8c62e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.897012 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0978bd5c-49d8-4120-8052-69d29ecea82b-ovnkube-config\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.900510 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0978bd5c-49d8-4120-8052-69d29ecea82b-ovn-node-metrics-cert\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:13 crc kubenswrapper[4742]: I0317 11:25:13.924466 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjk6k\" (UniqueName: \"kubernetes.io/projected/0978bd5c-49d8-4120-8052-69d29ecea82b-kube-api-access-cjk6k\") pod \"ovnkube-node-5gwls\" (UID: \"0978bd5c-49d8-4120-8052-69d29ecea82b\") " pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.049864 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovnkube-controller/3.log" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.053462 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovn-acl-logging/0.log" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.054676 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwfsr_d021cdee-f700-4a5f-a62e-be4acbb8c62e/ovn-controller/0.log" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.055109 4742 generic.go:334] "Generic (PLEG): container finished" podID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerID="a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a" exitCode=0 Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.055315 4742 generic.go:334] "Generic (PLEG): container finished" podID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerID="ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65" exitCode=0 Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.055455 4742 generic.go:334] "Generic (PLEG): container finished" podID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerID="e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0" exitCode=0 Mar 17 11:25:14 crc 
kubenswrapper[4742]: I0317 11:25:14.055572 4742 generic.go:334] "Generic (PLEG): container finished" podID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerID="f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425" exitCode=0 Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.055704 4742 generic.go:334] "Generic (PLEG): container finished" podID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerID="f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397" exitCode=0 Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.055833 4742 generic.go:334] "Generic (PLEG): container finished" podID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerID="0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33" exitCode=0 Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.055985 4742 generic.go:334] "Generic (PLEG): container finished" podID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerID="b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2" exitCode=143 Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.056117 4742 generic.go:334] "Generic (PLEG): container finished" podID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" containerID="c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545" exitCode=143 Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.055166 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerDied","Data":"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.056470 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerDied","Data":"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.056699 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerDied","Data":"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.056848 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerDied","Data":"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.057007 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerDied","Data":"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.057194 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerDied","Data":"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.057837 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.058272 4742 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.058430 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.058591 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.058822 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.059074 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.059215 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.059401 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.059601 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.059851 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerDied","Data":"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.055409 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.058628 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwmfc_ff1068ee-5ebe-4575-806d-967a3b9bfb6a/kube-multus/2.log" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.056541 4742 scope.go:117] "RemoveContainer" containerID="a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060086 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060546 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060561 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060566 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060572 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060577 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060582 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060587 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060603 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060615 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060628 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerDied","Data":"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060647 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a"} Mar 17 11:25:14 crc 
kubenswrapper[4742]: I0317 11:25:14.060654 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060661 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060667 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060672 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060677 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060682 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060688 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060693 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060698 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060705 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwfsr" event={"ID":"d021cdee-f700-4a5f-a62e-be4acbb8c62e","Type":"ContainerDied","Data":"61da21cdaeb0ecf937ece364594b3e839720124913fb881b02094fdd7fe63e87"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060713 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060719 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060724 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060729 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0"} Mar 17 11:25:14 crc 
kubenswrapper[4742]: I0317 11:25:14.060734 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060740 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060746 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060751 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060757 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.060763 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.064118 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwmfc_ff1068ee-5ebe-4575-806d-967a3b9bfb6a/kube-multus/1.log" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.064316 4742 generic.go:334] "Generic (PLEG): container finished" podID="ff1068ee-5ebe-4575-806d-967a3b9bfb6a" containerID="49f006810bcc95db05a54979c00d1df941ae6ad018abc40980080ba41668f2fa" exitCode=2 Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.064388 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwmfc" event={"ID":"ff1068ee-5ebe-4575-806d-967a3b9bfb6a","Type":"ContainerDied","Data":"49f006810bcc95db05a54979c00d1df941ae6ad018abc40980080ba41668f2fa"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.064611 4742 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1a7dfbf3da964f99f958fe0751c5fdfaf6d1c1d5938316d5fa840c4187b524fe"} Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.065160 4742 scope.go:117] "RemoveContainer" containerID="49f006810bcc95db05a54979c00d1df941ae6ad018abc40980080ba41668f2fa" Mar 17 11:25:14 crc kubenswrapper[4742]: E0317 11:25:14.065337 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-xwmfc_openshift-multus(ff1068ee-5ebe-4575-806d-967a3b9bfb6a)\"" pod="openshift-multus/multus-xwmfc" podUID="ff1068ee-5ebe-4575-806d-967a3b9bfb6a"
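The CrashLoopBackOff error above is the kubelet throttling restarts of the failed kube-multus container; the delay roughly doubles per crash from an initial value up to a cap. A sketch of that schedule, assuming the commonly cited kubelet defaults of 10s initial and 5m maximum (treat both constants as assumptions), which reproduces the "back-off 20s" seen in the message:

```go
package main

import (
	"fmt"
	"time"
)

// restartDelay doubles the back-off per observed crash, capped at max.
func restartDelay(crashes int, initial, max time.Duration) time.Duration {
	d := initial
	for i := 0; i < crashes; i++ {
		d *= 2
		if d > max {
			return max
		}
	}
	return d
}

func main() {
	for n := 0; n <= 6; n++ {
		fmt.Printf("crash %d -> back-off %v\n", n, restartDelay(n, 10*time.Second, 5*time.Minute))
	}
	// crash 1 -> back-off 20s, matching the message above.
}
```

Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.071485 4742 util.go:30] "No sandbox for pod can be found.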
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.100719 4742 scope.go:117] "RemoveContainer" containerID="80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.129715 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zwfsr"] Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.130090 4742 scope.go:117] "RemoveContainer" containerID="ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.135893 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zwfsr"] Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.151519 4742 scope.go:117] "RemoveContainer" containerID="e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.171099 4742 scope.go:117] "RemoveContainer" containerID="f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.199542 4742 scope.go:117] "RemoveContainer" containerID="f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.226759 4742 scope.go:117] "RemoveContainer" containerID="0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.240612 4742 scope.go:117] "RemoveContainer" containerID="b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.261602 4742 scope.go:117] "RemoveContainer" containerID="c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.336123 4742 scope.go:117] "RemoveContainer" containerID="8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.354312 4742 scope.go:117] "RemoveContainer" containerID="a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a" Mar 17 11:25:14 crc kubenswrapper[4742]: E0317 11:25:14.354853 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a\": container with ID starting with a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a not found: ID does not exist" containerID="a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.354950 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a"} err="failed to get container status \"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a\": rpc error: code = NotFound desc = could not find container \"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a\": container with ID starting with a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.354999 4742 scope.go:117] "RemoveContainer" containerID="80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2" Mar 17 11:25:14 crc kubenswrapper[4742]: E0317 11:25:14.355316 4742 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2\": container with ID starting with 80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2 not found: ID does not exist" containerID="80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.355359 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2"} err="failed to get container status \"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2\": rpc error: code = NotFound desc = could not find container \"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2\": container with ID starting with 80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.355389 4742 scope.go:117] "RemoveContainer" containerID="ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65" Mar 17 11:25:14 crc kubenswrapper[4742]: E0317 11:25:14.356314 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\": container with ID starting with ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65 not found: ID does not exist" containerID="ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.356379 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65"} err="failed to get container status \"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\": rpc error: code = NotFound desc = could not find container \"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\": container with ID starting with ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.356419 4742 scope.go:117] "RemoveContainer" containerID="e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0" Mar 17 11:25:14 crc kubenswrapper[4742]: E0317 11:25:14.356864 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\": container with ID starting with e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0 not found: ID does not exist" containerID="e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.356949 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0"} err="failed to get container status \"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\": rpc error: code = NotFound desc = could not find container \"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\": container with ID starting with e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.356988 4742 scope.go:117] "RemoveContainer" 
containerID="f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425" Mar 17 11:25:14 crc kubenswrapper[4742]: E0317 11:25:14.357428 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\": container with ID starting with f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425 not found: ID does not exist" containerID="f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.357482 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425"} err="failed to get container status \"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\": rpc error: code = NotFound desc = could not find container \"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\": container with ID starting with f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.357529 4742 scope.go:117] "RemoveContainer" containerID="f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397" Mar 17 11:25:14 crc kubenswrapper[4742]: E0317 11:25:14.358148 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\": container with ID starting with f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397 not found: ID does not exist" containerID="f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.358206 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397"} err="failed to get container status \"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\": rpc error: code = NotFound desc = could not find container \"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\": container with ID starting with f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.358241 4742 scope.go:117] "RemoveContainer" containerID="0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33" Mar 17 11:25:14 crc kubenswrapper[4742]: E0317 11:25:14.359046 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\": container with ID starting with 0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33 not found: ID does not exist" containerID="0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.359143 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33"} err="failed to get container status \"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\": rpc error: code = NotFound desc = could not find container \"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\": container with ID starting with 
0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.359204 4742 scope.go:117] "RemoveContainer" containerID="b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2" Mar 17 11:25:14 crc kubenswrapper[4742]: E0317 11:25:14.359807 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\": container with ID starting with b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2 not found: ID does not exist" containerID="b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.359873 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2"} err="failed to get container status \"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\": rpc error: code = NotFound desc = could not find container \"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\": container with ID starting with b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.359900 4742 scope.go:117] "RemoveContainer" containerID="c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545" Mar 17 11:25:14 crc kubenswrapper[4742]: E0317 11:25:14.360737 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\": container with ID starting with c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545 not found: ID does not exist" containerID="c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.360780 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545"} err="failed to get container status \"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\": rpc error: code = NotFound desc = could not find container \"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\": container with ID starting with c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.360801 4742 scope.go:117] "RemoveContainer" containerID="8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38" Mar 17 11:25:14 crc kubenswrapper[4742]: E0317 11:25:14.361124 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\": container with ID starting with 8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38 not found: ID does not exist" containerID="8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.361154 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38"} err="failed to get container status \"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\": rpc 
error: code = NotFound desc = could not find container \"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\": container with ID starting with 8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.361173 4742 scope.go:117] "RemoveContainer" containerID="a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.361441 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a"} err="failed to get container status \"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a\": rpc error: code = NotFound desc = could not find container \"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a\": container with ID starting with a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.361470 4742 scope.go:117] "RemoveContainer" containerID="80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.361813 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2"} err="failed to get container status \"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2\": rpc error: code = NotFound desc = could not find container \"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2\": container with ID starting with 80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.361850 4742 scope.go:117] "RemoveContainer" containerID="ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.362260 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65"} err="failed to get container status \"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\": rpc error: code = NotFound desc = could not find container \"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\": container with ID starting with ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.362293 4742 scope.go:117] "RemoveContainer" containerID="e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.362559 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0"} err="failed to get container status \"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\": rpc error: code = NotFound desc = could not find container \"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\": container with ID starting with e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.362592 4742 scope.go:117] "RemoveContainer" containerID="f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425" Mar 17 11:25:14 crc 
kubenswrapper[4742]: I0317 11:25:14.363041 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425"} err="failed to get container status \"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\": rpc error: code = NotFound desc = could not find container \"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\": container with ID starting with f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.363076 4742 scope.go:117] "RemoveContainer" containerID="f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.363353 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397"} err="failed to get container status \"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\": rpc error: code = NotFound desc = could not find container \"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\": container with ID starting with f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.363386 4742 scope.go:117] "RemoveContainer" containerID="0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.363820 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33"} err="failed to get container status \"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\": rpc error: code = NotFound desc = could not find container \"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\": container with ID starting with 0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.363856 4742 scope.go:117] "RemoveContainer" containerID="b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.364134 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2"} err="failed to get container status \"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\": rpc error: code = NotFound desc = could not find container \"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\": container with ID starting with b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.364170 4742 scope.go:117] "RemoveContainer" containerID="c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.364461 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545"} err="failed to get container status \"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\": rpc error: code = NotFound desc = could not find container \"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\": container with ID 
starting with c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.364497 4742 scope.go:117] "RemoveContainer" containerID="8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.364804 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38"} err="failed to get container status \"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\": rpc error: code = NotFound desc = could not find container \"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\": container with ID starting with 8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.364839 4742 scope.go:117] "RemoveContainer" containerID="a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.365331 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a"} err="failed to get container status \"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a\": rpc error: code = NotFound desc = could not find container \"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a\": container with ID starting with a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.365367 4742 scope.go:117] "RemoveContainer" containerID="80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.365616 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2"} err="failed to get container status \"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2\": rpc error: code = NotFound desc = could not find container \"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2\": container with ID starting with 80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.365655 4742 scope.go:117] "RemoveContainer" containerID="ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.365888 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65"} err="failed to get container status \"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\": rpc error: code = NotFound desc = could not find container \"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\": container with ID starting with ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.365976 4742 scope.go:117] "RemoveContainer" containerID="e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.366346 4742 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0"} err="failed to get container status \"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\": rpc error: code = NotFound desc = could not find container \"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\": container with ID starting with e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.366379 4742 scope.go:117] "RemoveContainer" containerID="f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.366615 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425"} err="failed to get container status \"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\": rpc error: code = NotFound desc = could not find container \"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\": container with ID starting with f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.366638 4742 scope.go:117] "RemoveContainer" containerID="f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.366829 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397"} err="failed to get container status \"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\": rpc error: code = NotFound desc = could not find container \"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\": container with ID starting with f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.366852 4742 scope.go:117] "RemoveContainer" containerID="0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.367081 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33"} err="failed to get container status \"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\": rpc error: code = NotFound desc = could not find container \"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\": container with ID starting with 0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.367112 4742 scope.go:117] "RemoveContainer" containerID="b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.367353 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2"} err="failed to get container status \"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\": rpc error: code = NotFound desc = could not find container \"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\": container with ID starting with b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2 not found: ID does not exist" Mar 
17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.367382 4742 scope.go:117] "RemoveContainer" containerID="c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.367721 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545"} err="failed to get container status \"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\": rpc error: code = NotFound desc = could not find container \"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\": container with ID starting with c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.367754 4742 scope.go:117] "RemoveContainer" containerID="8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.368028 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38"} err="failed to get container status \"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\": rpc error: code = NotFound desc = could not find container \"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\": container with ID starting with 8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.368057 4742 scope.go:117] "RemoveContainer" containerID="a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.368273 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a"} err="failed to get container status \"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a\": rpc error: code = NotFound desc = could not find container \"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a\": container with ID starting with a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.368302 4742 scope.go:117] "RemoveContainer" containerID="80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.368516 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2"} err="failed to get container status \"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2\": rpc error: code = NotFound desc = could not find container \"80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2\": container with ID starting with 80d5e68dde8810766aa9f233245165794f1f709e777a1aa7d451f6747b4cf1c2 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.368545 4742 scope.go:117] "RemoveContainer" containerID="ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.368737 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65"} err="failed to get container status 
\"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\": rpc error: code = NotFound desc = could not find container \"ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65\": container with ID starting with ff73305f46cbb74a6f23b3ad503af956b36a43c9e18445b8ae843defa2ae7e65 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.368766 4742 scope.go:117] "RemoveContainer" containerID="e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.369039 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0"} err="failed to get container status \"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\": rpc error: code = NotFound desc = could not find container \"e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0\": container with ID starting with e57d5d1183e6601e9da04aee1029a40e179853a69d9b603e45051d9836c8f8b0 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.369065 4742 scope.go:117] "RemoveContainer" containerID="f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.369438 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425"} err="failed to get container status \"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\": rpc error: code = NotFound desc = could not find container \"f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425\": container with ID starting with f83f67f75271071ad929b41f89728a355bbec8a7b01156238428a1f0762db425 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.369461 4742 scope.go:117] "RemoveContainer" containerID="f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.369834 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397"} err="failed to get container status \"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\": rpc error: code = NotFound desc = could not find container \"f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397\": container with ID starting with f79dce574f3dbc1787ab9d25585a6b0e83ca6f9f5ae0ca9b308cd8944abf7397 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.369858 4742 scope.go:117] "RemoveContainer" containerID="0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.370121 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33"} err="failed to get container status \"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\": rpc error: code = NotFound desc = could not find container \"0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33\": container with ID starting with 0028a00fda6061ad8a16ef8bbfe7a7b6d3dedb656846c86515d194e81cd17e33 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.370153 4742 scope.go:117] "RemoveContainer" 
containerID="b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.370495 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2"} err="failed to get container status \"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\": rpc error: code = NotFound desc = could not find container \"b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2\": container with ID starting with b3bef759553bf249e46625b7e807a6c68d102a30554b5fb0fd65d1883a49a5a2 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.370518 4742 scope.go:117] "RemoveContainer" containerID="c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.370736 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545"} err="failed to get container status \"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\": rpc error: code = NotFound desc = could not find container \"c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545\": container with ID starting with c8b38d128a7d13874ecdf8d401fa2fc35627e5165bb1e3aae292d7d6f7676545 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.370759 4742 scope.go:117] "RemoveContainer" containerID="8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.371218 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38"} err="failed to get container status \"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\": rpc error: code = NotFound desc = could not find container \"8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38\": container with ID starting with 8512df55a32ab8b6eb2e4d20fc5baaa77f4477bdf3dcfc218e0590f6cc68bc38 not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.371247 4742 scope.go:117] "RemoveContainer" containerID="a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.371514 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a"} err="failed to get container status \"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a\": rpc error: code = NotFound desc = could not find container \"a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a\": container with ID starting with a7d34c4f48eebd433646b361ffb96db7701cd0ee0241374d767039770c5f671a not found: ID does not exist" Mar 17 11:25:14 crc kubenswrapper[4742]: I0317 11:25:14.677251 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d021cdee-f700-4a5f-a62e-be4acbb8c62e" path="/var/lib/kubelet/pods/d021cdee-f700-4a5f-a62e-be4acbb8c62e/volumes" Mar 17 11:25:15 crc kubenswrapper[4742]: I0317 11:25:15.074947 4742 generic.go:334] "Generic (PLEG): container finished" podID="0978bd5c-49d8-4120-8052-69d29ecea82b" containerID="6318bf4f2a71bad834e26927e01b4fdf1f619c145fabadc9b7e434c690cbab35" exitCode=0 Mar 17 11:25:15 crc kubenswrapper[4742]: I0317 
11:25:15.075063 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" event={"ID":"0978bd5c-49d8-4120-8052-69d29ecea82b","Type":"ContainerDied","Data":"6318bf4f2a71bad834e26927e01b4fdf1f619c145fabadc9b7e434c690cbab35"} Mar 17 11:25:15 crc kubenswrapper[4742]: I0317 11:25:15.075139 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" event={"ID":"0978bd5c-49d8-4120-8052-69d29ecea82b","Type":"ContainerStarted","Data":"72a7a0aef8a69347a6acc467fb189250c3a21b080fc01954b4973c1a7931adf5"} Mar 17 11:25:16 crc kubenswrapper[4742]: I0317 11:25:16.093522 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" event={"ID":"0978bd5c-49d8-4120-8052-69d29ecea82b","Type":"ContainerStarted","Data":"969e0cb1aed9baab5d4d0cf14e31357580dcccc80c9c91ed4f7fd59f35af986b"} Mar 17 11:25:16 crc kubenswrapper[4742]: I0317 11:25:16.093762 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" event={"ID":"0978bd5c-49d8-4120-8052-69d29ecea82b","Type":"ContainerStarted","Data":"a8d4dcb6b6f786eedae6b4f809b8238ee81f2d7d103e39004151047272ac20cf"} Mar 17 11:25:16 crc kubenswrapper[4742]: I0317 11:25:16.093775 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" event={"ID":"0978bd5c-49d8-4120-8052-69d29ecea82b","Type":"ContainerStarted","Data":"9cd3b37169835551a531b52f098ed571d6d0b382b139d7236c804c13bdb03eaf"} Mar 17 11:25:16 crc kubenswrapper[4742]: I0317 11:25:16.093787 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" event={"ID":"0978bd5c-49d8-4120-8052-69d29ecea82b","Type":"ContainerStarted","Data":"dbdbfae44f726bd038ea804a39c48adbcf7c533209e4051fd08864e1bc1f1c08"} Mar 17 11:25:16 crc kubenswrapper[4742]: I0317 11:25:16.093796 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" event={"ID":"0978bd5c-49d8-4120-8052-69d29ecea82b","Type":"ContainerStarted","Data":"43d66449694e776f5ab121826170ada4c62ca1a9b4e07102266d0bbf27b552cd"} Mar 17 11:25:17 crc kubenswrapper[4742]: I0317 11:25:17.106792 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" event={"ID":"0978bd5c-49d8-4120-8052-69d29ecea82b","Type":"ContainerStarted","Data":"801fb9c42980a03d7043000b3f840229deac73c183267d82eba31b2582600b62"} Mar 17 11:25:18 crc kubenswrapper[4742]: I0317 11:25:18.044304 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:25:18 crc kubenswrapper[4742]: I0317 11:25:18.044645 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:25:19 crc kubenswrapper[4742]: I0317 11:25:19.126792 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" 
event={"ID":"0978bd5c-49d8-4120-8052-69d29ecea82b","Type":"ContainerStarted","Data":"c4193ad349c69307c12e8fca6f03fdaad23ef39a565676e7ad2d51a36aa4b2b5"} Mar 17 11:25:21 crc kubenswrapper[4742]: I0317 11:25:21.141289 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" event={"ID":"0978bd5c-49d8-4120-8052-69d29ecea82b","Type":"ContainerStarted","Data":"1260af9e40f8895c0ebea486abc2fd2f1fa924024edc3d03eaf850ce939caf6a"} Mar 17 11:25:21 crc kubenswrapper[4742]: I0317 11:25:21.141736 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:21 crc kubenswrapper[4742]: I0317 11:25:21.142213 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:21 crc kubenswrapper[4742]: I0317 11:25:21.177728 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:21 crc kubenswrapper[4742]: I0317 11:25:21.182530 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" podStartSLOduration=8.182512002 podStartE2EDuration="8.182512002s" podCreationTimestamp="2026-03-17 11:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:25:21.179581099 +0000 UTC m=+824.305708867" watchObservedRunningTime="2026-03-17 11:25:21.182512002 +0000 UTC m=+824.308639770" Mar 17 11:25:21 crc kubenswrapper[4742]: I0317 11:25:21.190589 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:22 crc kubenswrapper[4742]: I0317 11:25:22.149011 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:27 crc kubenswrapper[4742]: I0317 11:25:27.663234 4742 scope.go:117] "RemoveContainer" containerID="49f006810bcc95db05a54979c00d1df941ae6ad018abc40980080ba41668f2fa" Mar 17 11:25:27 crc kubenswrapper[4742]: E0317 11:25:27.664138 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-xwmfc_openshift-multus(ff1068ee-5ebe-4575-806d-967a3b9bfb6a)\"" pod="openshift-multus/multus-xwmfc" podUID="ff1068ee-5ebe-4575-806d-967a3b9bfb6a" Mar 17 11:25:39 crc kubenswrapper[4742]: I0317 11:25:39.904156 4742 scope.go:117] "RemoveContainer" containerID="1a7dfbf3da964f99f958fe0751c5fdfaf6d1c1d5938316d5fa840c4187b524fe" Mar 17 11:25:40 crc kubenswrapper[4742]: I0317 11:25:40.310050 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwmfc_ff1068ee-5ebe-4575-806d-967a3b9bfb6a/kube-multus/2.log" Mar 17 11:25:41 crc kubenswrapper[4742]: I0317 11:25:41.663155 4742 scope.go:117] "RemoveContainer" containerID="49f006810bcc95db05a54979c00d1df941ae6ad018abc40980080ba41668f2fa" Mar 17 11:25:42 crc kubenswrapper[4742]: I0317 11:25:42.328394 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xwmfc_ff1068ee-5ebe-4575-806d-967a3b9bfb6a/kube-multus/2.log" Mar 17 11:25:42 crc kubenswrapper[4742]: I0317 11:25:42.328849 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xwmfc" 
event={"ID":"ff1068ee-5ebe-4575-806d-967a3b9bfb6a","Type":"ContainerStarted","Data":"85d3d3b3e52361266f7f8c38c772b9f21e049856748c0d643593f8a7aae11e2b"} Mar 17 11:25:44 crc kubenswrapper[4742]: I0317 11:25:44.108847 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5gwls" Mar 17 11:25:48 crc kubenswrapper[4742]: I0317 11:25:48.044424 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:25:48 crc kubenswrapper[4742]: I0317 11:25:48.045232 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:25:48 crc kubenswrapper[4742]: I0317 11:25:48.045313 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:25:48 crc kubenswrapper[4742]: I0317 11:25:48.046181 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b6d37342f3ee85fc8b1ee717e3e6b6ff2837c9e6e923cd75738d5afd1b0bd6d"} pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 11:25:48 crc kubenswrapper[4742]: I0317 11:25:48.046282 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" containerID="cri-o://0b6d37342f3ee85fc8b1ee717e3e6b6ff2837c9e6e923cd75738d5afd1b0bd6d" gracePeriod=600 Mar 17 11:25:48 crc kubenswrapper[4742]: E0317 11:25:48.892230 4742 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e11ad39_38bb_4b70_9cac_ce078b37f882.slice/crio-conmon-0b6d37342f3ee85fc8b1ee717e3e6b6ff2837c9e6e923cd75738d5afd1b0bd6d.scope\": RecentStats: unable to find data in memory cache]" Mar 17 11:25:49 crc kubenswrapper[4742]: I0317 11:25:49.381018 4742 generic.go:334] "Generic (PLEG): container finished" podID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerID="0b6d37342f3ee85fc8b1ee717e3e6b6ff2837c9e6e923cd75738d5afd1b0bd6d" exitCode=0 Mar 17 11:25:49 crc kubenswrapper[4742]: I0317 11:25:49.381100 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerDied","Data":"0b6d37342f3ee85fc8b1ee717e3e6b6ff2837c9e6e923cd75738d5afd1b0bd6d"} Mar 17 11:25:49 crc kubenswrapper[4742]: I0317 11:25:49.381422 4742 scope.go:117] "RemoveContainer" containerID="c79ebe7c8568e968315882c89bb4596b0d3698f3b81bf528102fe1275360f30f" Mar 17 11:25:50 crc kubenswrapper[4742]: I0317 11:25:50.391527 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" 
event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"e970ab8ae9b7236a8af0e70d950c97f70be620ea87e4acbc181c30424216e493"} Mar 17 11:25:53 crc kubenswrapper[4742]: I0317 11:25:53.406022 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr"] Mar 17 11:25:53 crc kubenswrapper[4742]: I0317 11:25:53.407411 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" Mar 17 11:25:53 crc kubenswrapper[4742]: I0317 11:25:53.409434 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 17 11:25:53 crc kubenswrapper[4742]: I0317 11:25:53.418338 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr"] Mar 17 11:25:53 crc kubenswrapper[4742]: I0317 11:25:53.499879 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm6xl\" (UniqueName: \"kubernetes.io/projected/20e57e18-cc27-4d2e-9207-e784beb4ce2f-kube-api-access-pm6xl\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr\" (UID: \"20e57e18-cc27-4d2e-9207-e784beb4ce2f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" Mar 17 11:25:53 crc kubenswrapper[4742]: I0317 11:25:53.500000 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20e57e18-cc27-4d2e-9207-e784beb4ce2f-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr\" (UID: \"20e57e18-cc27-4d2e-9207-e784beb4ce2f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" Mar 17 11:25:53 crc kubenswrapper[4742]: I0317 11:25:53.500047 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20e57e18-cc27-4d2e-9207-e784beb4ce2f-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr\" (UID: \"20e57e18-cc27-4d2e-9207-e784beb4ce2f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" Mar 17 11:25:53 crc kubenswrapper[4742]: I0317 11:25:53.601153 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm6xl\" (UniqueName: \"kubernetes.io/projected/20e57e18-cc27-4d2e-9207-e784beb4ce2f-kube-api-access-pm6xl\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr\" (UID: \"20e57e18-cc27-4d2e-9207-e784beb4ce2f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" Mar 17 11:25:53 crc kubenswrapper[4742]: I0317 11:25:53.601249 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20e57e18-cc27-4d2e-9207-e784beb4ce2f-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr\" (UID: \"20e57e18-cc27-4d2e-9207-e784beb4ce2f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" Mar 17 11:25:53 crc kubenswrapper[4742]: I0317 11:25:53.601310 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/20e57e18-cc27-4d2e-9207-e784beb4ce2f-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr\" (UID: \"20e57e18-cc27-4d2e-9207-e784beb4ce2f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" Mar 17 11:25:53 crc kubenswrapper[4742]: I0317 11:25:53.601761 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20e57e18-cc27-4d2e-9207-e784beb4ce2f-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr\" (UID: \"20e57e18-cc27-4d2e-9207-e784beb4ce2f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" Mar 17 11:25:53 crc kubenswrapper[4742]: I0317 11:25:53.601877 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20e57e18-cc27-4d2e-9207-e784beb4ce2f-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr\" (UID: \"20e57e18-cc27-4d2e-9207-e784beb4ce2f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" Mar 17 11:25:53 crc kubenswrapper[4742]: I0317 11:25:53.639943 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm6xl\" (UniqueName: \"kubernetes.io/projected/20e57e18-cc27-4d2e-9207-e784beb4ce2f-kube-api-access-pm6xl\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr\" (UID: \"20e57e18-cc27-4d2e-9207-e784beb4ce2f\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" Mar 17 11:25:53 crc kubenswrapper[4742]: I0317 11:25:53.733168 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" Mar 17 11:25:54 crc kubenswrapper[4742]: I0317 11:25:54.008270 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr"] Mar 17 11:25:54 crc kubenswrapper[4742]: I0317 11:25:54.422578 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" event={"ID":"20e57e18-cc27-4d2e-9207-e784beb4ce2f","Type":"ContainerStarted","Data":"396c7c2bde2b7f397641b935c0c3180ad9c5d5694e7e7b7692be75e70f92e544"} Mar 17 11:25:54 crc kubenswrapper[4742]: I0317 11:25:54.422864 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" event={"ID":"20e57e18-cc27-4d2e-9207-e784beb4ce2f","Type":"ContainerStarted","Data":"e9309269f0f1dc9428a4efabf8f93a559b7938fc9cfbbe86a63bb7bc27e15107"} Mar 17 11:25:55 crc kubenswrapper[4742]: I0317 11:25:55.347655 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t77nk"] Mar 17 11:25:55 crc kubenswrapper[4742]: I0317 11:25:55.349835 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t77nk" Mar 17 11:25:55 crc kubenswrapper[4742]: I0317 11:25:55.362075 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t77nk"] Mar 17 11:25:55 crc kubenswrapper[4742]: I0317 11:25:55.428642 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjb6n\" (UniqueName: \"kubernetes.io/projected/cdee2ceb-f58d-469d-9428-44daeba832c7-kube-api-access-wjb6n\") pod \"redhat-operators-t77nk\" (UID: \"cdee2ceb-f58d-469d-9428-44daeba832c7\") " pod="openshift-marketplace/redhat-operators-t77nk" Mar 17 11:25:55 crc kubenswrapper[4742]: I0317 11:25:55.428725 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdee2ceb-f58d-469d-9428-44daeba832c7-catalog-content\") pod \"redhat-operators-t77nk\" (UID: \"cdee2ceb-f58d-469d-9428-44daeba832c7\") " pod="openshift-marketplace/redhat-operators-t77nk" Mar 17 11:25:55 crc kubenswrapper[4742]: I0317 11:25:55.428759 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdee2ceb-f58d-469d-9428-44daeba832c7-utilities\") pod \"redhat-operators-t77nk\" (UID: \"cdee2ceb-f58d-469d-9428-44daeba832c7\") " pod="openshift-marketplace/redhat-operators-t77nk" Mar 17 11:25:55 crc kubenswrapper[4742]: I0317 11:25:55.433076 4742 generic.go:334] "Generic (PLEG): container finished" podID="20e57e18-cc27-4d2e-9207-e784beb4ce2f" containerID="396c7c2bde2b7f397641b935c0c3180ad9c5d5694e7e7b7692be75e70f92e544" exitCode=0 Mar 17 11:25:55 crc kubenswrapper[4742]: I0317 11:25:55.433110 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" event={"ID":"20e57e18-cc27-4d2e-9207-e784beb4ce2f","Type":"ContainerDied","Data":"396c7c2bde2b7f397641b935c0c3180ad9c5d5694e7e7b7692be75e70f92e544"} Mar 17 11:25:55 crc kubenswrapper[4742]: I0317 11:25:55.529931 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjb6n\" (UniqueName: \"kubernetes.io/projected/cdee2ceb-f58d-469d-9428-44daeba832c7-kube-api-access-wjb6n\") pod \"redhat-operators-t77nk\" (UID: \"cdee2ceb-f58d-469d-9428-44daeba832c7\") " pod="openshift-marketplace/redhat-operators-t77nk" Mar 17 11:25:55 crc kubenswrapper[4742]: I0317 11:25:55.530222 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdee2ceb-f58d-469d-9428-44daeba832c7-catalog-content\") pod \"redhat-operators-t77nk\" (UID: \"cdee2ceb-f58d-469d-9428-44daeba832c7\") " pod="openshift-marketplace/redhat-operators-t77nk" Mar 17 11:25:55 crc kubenswrapper[4742]: I0317 11:25:55.530321 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdee2ceb-f58d-469d-9428-44daeba832c7-utilities\") pod \"redhat-operators-t77nk\" (UID: \"cdee2ceb-f58d-469d-9428-44daeba832c7\") " pod="openshift-marketplace/redhat-operators-t77nk" Mar 17 11:25:55 crc kubenswrapper[4742]: I0317 11:25:55.530744 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdee2ceb-f58d-469d-9428-44daeba832c7-catalog-content\") pod \"redhat-operators-t77nk\" (UID: 
\"cdee2ceb-f58d-469d-9428-44daeba832c7\") " pod="openshift-marketplace/redhat-operators-t77nk" Mar 17 11:25:55 crc kubenswrapper[4742]: I0317 11:25:55.530872 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdee2ceb-f58d-469d-9428-44daeba832c7-utilities\") pod \"redhat-operators-t77nk\" (UID: \"cdee2ceb-f58d-469d-9428-44daeba832c7\") " pod="openshift-marketplace/redhat-operators-t77nk" Mar 17 11:25:55 crc kubenswrapper[4742]: I0317 11:25:55.547488 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjb6n\" (UniqueName: \"kubernetes.io/projected/cdee2ceb-f58d-469d-9428-44daeba832c7-kube-api-access-wjb6n\") pod \"redhat-operators-t77nk\" (UID: \"cdee2ceb-f58d-469d-9428-44daeba832c7\") " pod="openshift-marketplace/redhat-operators-t77nk" Mar 17 11:25:55 crc kubenswrapper[4742]: I0317 11:25:55.693849 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t77nk" Mar 17 11:25:55 crc kubenswrapper[4742]: I0317 11:25:55.914293 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t77nk"] Mar 17 11:25:55 crc kubenswrapper[4742]: W0317 11:25:55.917223 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdee2ceb_f58d_469d_9428_44daeba832c7.slice/crio-81eba4c0c48202b394a76f0de609b455386e9a0d004a7a707d4450277154fe5f WatchSource:0}: Error finding container 81eba4c0c48202b394a76f0de609b455386e9a0d004a7a707d4450277154fe5f: Status 404 returned error can't find the container with id 81eba4c0c48202b394a76f0de609b455386e9a0d004a7a707d4450277154fe5f Mar 17 11:25:56 crc kubenswrapper[4742]: I0317 11:25:56.442655 4742 generic.go:334] "Generic (PLEG): container finished" podID="cdee2ceb-f58d-469d-9428-44daeba832c7" containerID="0c3e2fb8b85bfe472cde5c59d64e169ac654993be2c3e16945f463110e59b3a6" exitCode=0 Mar 17 11:25:56 crc kubenswrapper[4742]: I0317 11:25:56.442739 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t77nk" event={"ID":"cdee2ceb-f58d-469d-9428-44daeba832c7","Type":"ContainerDied","Data":"0c3e2fb8b85bfe472cde5c59d64e169ac654993be2c3e16945f463110e59b3a6"} Mar 17 11:25:56 crc kubenswrapper[4742]: I0317 11:25:56.442975 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t77nk" event={"ID":"cdee2ceb-f58d-469d-9428-44daeba832c7","Type":"ContainerStarted","Data":"81eba4c0c48202b394a76f0de609b455386e9a0d004a7a707d4450277154fe5f"} Mar 17 11:25:57 crc kubenswrapper[4742]: I0317 11:25:57.455798 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t77nk" event={"ID":"cdee2ceb-f58d-469d-9428-44daeba832c7","Type":"ContainerStarted","Data":"f9fe58a89d181eba3feaeff6d95d0b992c00990ce020d4fea2c8f895fac1c45b"} Mar 17 11:25:57 crc kubenswrapper[4742]: I0317 11:25:57.991368 4742 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 17 11:25:58 crc kubenswrapper[4742]: I0317 11:25:58.465896 4742 generic.go:334] "Generic (PLEG): container finished" podID="20e57e18-cc27-4d2e-9207-e784beb4ce2f" containerID="f300433f2204372b93fde320768febfd9b965e9dddbec627892dc105e3acfbd8" exitCode=0 Mar 17 11:25:58 crc kubenswrapper[4742]: I0317 11:25:58.465993 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" event={"ID":"20e57e18-cc27-4d2e-9207-e784beb4ce2f","Type":"ContainerDied","Data":"f300433f2204372b93fde320768febfd9b965e9dddbec627892dc105e3acfbd8"} Mar 17 11:25:58 crc kubenswrapper[4742]: I0317 11:25:58.469500 4742 generic.go:334] "Generic (PLEG): container finished" podID="cdee2ceb-f58d-469d-9428-44daeba832c7" containerID="f9fe58a89d181eba3feaeff6d95d0b992c00990ce020d4fea2c8f895fac1c45b" exitCode=0 Mar 17 11:25:58 crc kubenswrapper[4742]: I0317 11:25:58.469537 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t77nk" event={"ID":"cdee2ceb-f58d-469d-9428-44daeba832c7","Type":"ContainerDied","Data":"f9fe58a89d181eba3feaeff6d95d0b992c00990ce020d4fea2c8f895fac1c45b"} Mar 17 11:25:59 crc kubenswrapper[4742]: I0317 11:25:59.485264 4742 generic.go:334] "Generic (PLEG): container finished" podID="20e57e18-cc27-4d2e-9207-e784beb4ce2f" containerID="35fc5f28ac19094fc45cfa6dfa457682bb9866a2d0521ab23aa6bd97573251a4" exitCode=0 Mar 17 11:25:59 crc kubenswrapper[4742]: I0317 11:25:59.485339 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" event={"ID":"20e57e18-cc27-4d2e-9207-e784beb4ce2f","Type":"ContainerDied","Data":"35fc5f28ac19094fc45cfa6dfa457682bb9866a2d0521ab23aa6bd97573251a4"} Mar 17 11:25:59 crc kubenswrapper[4742]: I0317 11:25:59.490254 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t77nk" event={"ID":"cdee2ceb-f58d-469d-9428-44daeba832c7","Type":"ContainerStarted","Data":"f16ff34a9173e378cce21e7d92d4501a273acd97fb3178634b64e404ea883aae"} Mar 17 11:25:59 crc kubenswrapper[4742]: I0317 11:25:59.541900 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t77nk" podStartSLOduration=2.142424455 podStartE2EDuration="4.54187734s" podCreationTimestamp="2026-03-17 11:25:55 +0000 UTC" firstStartedPulling="2026-03-17 11:25:56.444249258 +0000 UTC m=+859.570377026" lastFinishedPulling="2026-03-17 11:25:58.843702143 +0000 UTC m=+861.969829911" observedRunningTime="2026-03-17 11:25:59.538460397 +0000 UTC m=+862.664588185" watchObservedRunningTime="2026-03-17 11:25:59.54187734 +0000 UTC m=+862.668005108" Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.145892 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562446-67mbt"] Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.147320 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562446-67mbt" Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.153116 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562446-67mbt"] Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.180700 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.181157 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.181203 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.316069 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb8qg\" (UniqueName: \"kubernetes.io/projected/ea540b21-cd5f-485f-863c-24bc91beab7e-kube-api-access-qb8qg\") pod \"auto-csr-approver-29562446-67mbt\" (UID: \"ea540b21-cd5f-485f-863c-24bc91beab7e\") " pod="openshift-infra/auto-csr-approver-29562446-67mbt" Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.417379 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb8qg\" (UniqueName: \"kubernetes.io/projected/ea540b21-cd5f-485f-863c-24bc91beab7e-kube-api-access-qb8qg\") pod \"auto-csr-approver-29562446-67mbt\" (UID: \"ea540b21-cd5f-485f-863c-24bc91beab7e\") " pod="openshift-infra/auto-csr-approver-29562446-67mbt" Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.451876 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb8qg\" (UniqueName: \"kubernetes.io/projected/ea540b21-cd5f-485f-863c-24bc91beab7e-kube-api-access-qb8qg\") pod \"auto-csr-approver-29562446-67mbt\" (UID: \"ea540b21-cd5f-485f-863c-24bc91beab7e\") " pod="openshift-infra/auto-csr-approver-29562446-67mbt" Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.494859 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562446-67mbt" Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.719685 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562446-67mbt"] Mar 17 11:26:00 crc kubenswrapper[4742]: W0317 11:26:00.723172 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea540b21_cd5f_485f_863c_24bc91beab7e.slice/crio-c366d77f8584cfa5778de4a5fe1ef531f1a2faa50808094946d259af998d5ce8 WatchSource:0}: Error finding container c366d77f8584cfa5778de4a5fe1ef531f1a2faa50808094946d259af998d5ce8: Status 404 returned error can't find the container with id c366d77f8584cfa5778de4a5fe1ef531f1a2faa50808094946d259af998d5ce8 Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.742816 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.821725 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm6xl\" (UniqueName: \"kubernetes.io/projected/20e57e18-cc27-4d2e-9207-e784beb4ce2f-kube-api-access-pm6xl\") pod \"20e57e18-cc27-4d2e-9207-e784beb4ce2f\" (UID: \"20e57e18-cc27-4d2e-9207-e784beb4ce2f\") " Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.821786 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20e57e18-cc27-4d2e-9207-e784beb4ce2f-util\") pod \"20e57e18-cc27-4d2e-9207-e784beb4ce2f\" (UID: \"20e57e18-cc27-4d2e-9207-e784beb4ce2f\") " Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.823036 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20e57e18-cc27-4d2e-9207-e784beb4ce2f-bundle\") pod \"20e57e18-cc27-4d2e-9207-e784beb4ce2f\" (UID: \"20e57e18-cc27-4d2e-9207-e784beb4ce2f\") " Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.823549 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e57e18-cc27-4d2e-9207-e784beb4ce2f-bundle" (OuterVolumeSpecName: "bundle") pod "20e57e18-cc27-4d2e-9207-e784beb4ce2f" (UID: "20e57e18-cc27-4d2e-9207-e784beb4ce2f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.825576 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e57e18-cc27-4d2e-9207-e784beb4ce2f-kube-api-access-pm6xl" (OuterVolumeSpecName: "kube-api-access-pm6xl") pod "20e57e18-cc27-4d2e-9207-e784beb4ce2f" (UID: "20e57e18-cc27-4d2e-9207-e784beb4ce2f"). InnerVolumeSpecName "kube-api-access-pm6xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.835676 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e57e18-cc27-4d2e-9207-e784beb4ce2f-util" (OuterVolumeSpecName: "util") pod "20e57e18-cc27-4d2e-9207-e784beb4ce2f" (UID: "20e57e18-cc27-4d2e-9207-e784beb4ce2f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.925389 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm6xl\" (UniqueName: \"kubernetes.io/projected/20e57e18-cc27-4d2e-9207-e784beb4ce2f-kube-api-access-pm6xl\") on node \"crc\" DevicePath \"\"" Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.925448 4742 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20e57e18-cc27-4d2e-9207-e784beb4ce2f-util\") on node \"crc\" DevicePath \"\"" Mar 17 11:26:00 crc kubenswrapper[4742]: I0317 11:26:00.925469 4742 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20e57e18-cc27-4d2e-9207-e784beb4ce2f-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:26:01 crc kubenswrapper[4742]: I0317 11:26:01.501331 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562446-67mbt" event={"ID":"ea540b21-cd5f-485f-863c-24bc91beab7e","Type":"ContainerStarted","Data":"c366d77f8584cfa5778de4a5fe1ef531f1a2faa50808094946d259af998d5ce8"} Mar 17 11:26:01 crc kubenswrapper[4742]: I0317 11:26:01.503497 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" event={"ID":"20e57e18-cc27-4d2e-9207-e784beb4ce2f","Type":"ContainerDied","Data":"e9309269f0f1dc9428a4efabf8f93a559b7938fc9cfbbe86a63bb7bc27e15107"} Mar 17 11:26:01 crc kubenswrapper[4742]: I0317 11:26:01.503523 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9309269f0f1dc9428a4efabf8f93a559b7938fc9cfbbe86a63bb7bc27e15107" Mar 17 11:26:01 crc kubenswrapper[4742]: I0317 11:26:01.503619 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr" Mar 17 11:26:02 crc kubenswrapper[4742]: I0317 11:26:02.510705 4742 generic.go:334] "Generic (PLEG): container finished" podID="ea540b21-cd5f-485f-863c-24bc91beab7e" containerID="e9c18b34265ae35ef76b3212e8872778d92697fbfc3fb669589557c198c9c394" exitCode=0 Mar 17 11:26:02 crc kubenswrapper[4742]: I0317 11:26:02.510770 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562446-67mbt" event={"ID":"ea540b21-cd5f-485f-863c-24bc91beab7e","Type":"ContainerDied","Data":"e9c18b34265ae35ef76b3212e8872778d92697fbfc3fb669589557c198c9c394"} Mar 17 11:26:02 crc kubenswrapper[4742]: I0317 11:26:02.940838 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8lv9p"] Mar 17 11:26:02 crc kubenswrapper[4742]: E0317 11:26:02.941122 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e57e18-cc27-4d2e-9207-e784beb4ce2f" containerName="extract" Mar 17 11:26:02 crc kubenswrapper[4742]: I0317 11:26:02.941134 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e57e18-cc27-4d2e-9207-e784beb4ce2f" containerName="extract" Mar 17 11:26:02 crc kubenswrapper[4742]: E0317 11:26:02.941142 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e57e18-cc27-4d2e-9207-e784beb4ce2f" containerName="pull" Mar 17 11:26:02 crc kubenswrapper[4742]: I0317 11:26:02.941147 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e57e18-cc27-4d2e-9207-e784beb4ce2f" containerName="pull" Mar 17 11:26:02 crc kubenswrapper[4742]: E0317 11:26:02.941154 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e57e18-cc27-4d2e-9207-e784beb4ce2f" containerName="util" Mar 17 11:26:02 crc kubenswrapper[4742]: I0317 11:26:02.941160 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e57e18-cc27-4d2e-9207-e784beb4ce2f" containerName="util" Mar 17 11:26:02 crc kubenswrapper[4742]: I0317 11:26:02.941258 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="20e57e18-cc27-4d2e-9207-e784beb4ce2f" containerName="extract" Mar 17 11:26:02 crc kubenswrapper[4742]: I0317 11:26:02.941981 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8lv9p" Mar 17 11:26:02 crc kubenswrapper[4742]: I0317 11:26:02.958556 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8lv9p"] Mar 17 11:26:03 crc kubenswrapper[4742]: I0317 11:26:03.053177 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-utilities\") pod \"redhat-marketplace-8lv9p\" (UID: \"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85\") " pod="openshift-marketplace/redhat-marketplace-8lv9p" Mar 17 11:26:03 crc kubenswrapper[4742]: I0317 11:26:03.053266 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-catalog-content\") pod \"redhat-marketplace-8lv9p\" (UID: \"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85\") " pod="openshift-marketplace/redhat-marketplace-8lv9p" Mar 17 11:26:03 crc kubenswrapper[4742]: I0317 11:26:03.053424 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km5fn\" (UniqueName: \"kubernetes.io/projected/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-kube-api-access-km5fn\") pod \"redhat-marketplace-8lv9p\" (UID: \"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85\") " pod="openshift-marketplace/redhat-marketplace-8lv9p" Mar 17 11:26:03 crc kubenswrapper[4742]: I0317 11:26:03.154780 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km5fn\" (UniqueName: \"kubernetes.io/projected/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-kube-api-access-km5fn\") pod \"redhat-marketplace-8lv9p\" (UID: \"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85\") " pod="openshift-marketplace/redhat-marketplace-8lv9p" Mar 17 11:26:03 crc kubenswrapper[4742]: I0317 11:26:03.154861 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-utilities\") pod \"redhat-marketplace-8lv9p\" (UID: \"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85\") " pod="openshift-marketplace/redhat-marketplace-8lv9p" Mar 17 11:26:03 crc kubenswrapper[4742]: I0317 11:26:03.154949 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-catalog-content\") pod \"redhat-marketplace-8lv9p\" (UID: \"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85\") " pod="openshift-marketplace/redhat-marketplace-8lv9p" Mar 17 11:26:03 crc kubenswrapper[4742]: I0317 11:26:03.155405 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-utilities\") pod \"redhat-marketplace-8lv9p\" (UID: \"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85\") " pod="openshift-marketplace/redhat-marketplace-8lv9p" Mar 17 11:26:03 crc kubenswrapper[4742]: I0317 11:26:03.155490 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-catalog-content\") pod \"redhat-marketplace-8lv9p\" (UID: \"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85\") " pod="openshift-marketplace/redhat-marketplace-8lv9p" Mar 17 11:26:03 crc kubenswrapper[4742]: I0317 11:26:03.177345 4742 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-km5fn\" (UniqueName: \"kubernetes.io/projected/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-kube-api-access-km5fn\") pod \"redhat-marketplace-8lv9p\" (UID: \"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85\") " pod="openshift-marketplace/redhat-marketplace-8lv9p" Mar 17 11:26:03 crc kubenswrapper[4742]: I0317 11:26:03.274426 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8lv9p" Mar 17 11:26:03 crc kubenswrapper[4742]: I0317 11:26:03.489783 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8lv9p"] Mar 17 11:26:03 crc kubenswrapper[4742]: W0317 11:26:03.500618 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b7a4b10_cdd4_4b30_95e2_6cc92bd81b85.slice/crio-8cfc4d41110105754151ad8ccf3d91fda9c354ed68b7597dc35e4be6c8ff3d68 WatchSource:0}: Error finding container 8cfc4d41110105754151ad8ccf3d91fda9c354ed68b7597dc35e4be6c8ff3d68: Status 404 returned error can't find the container with id 8cfc4d41110105754151ad8ccf3d91fda9c354ed68b7597dc35e4be6c8ff3d68 Mar 17 11:26:03 crc kubenswrapper[4742]: I0317 11:26:03.525285 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lv9p" event={"ID":"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85","Type":"ContainerStarted","Data":"8cfc4d41110105754151ad8ccf3d91fda9c354ed68b7597dc35e4be6c8ff3d68"} Mar 17 11:26:03 crc kubenswrapper[4742]: I0317 11:26:03.822892 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562446-67mbt" Mar 17 11:26:03 crc kubenswrapper[4742]: I0317 11:26:03.969298 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb8qg\" (UniqueName: \"kubernetes.io/projected/ea540b21-cd5f-485f-863c-24bc91beab7e-kube-api-access-qb8qg\") pod \"ea540b21-cd5f-485f-863c-24bc91beab7e\" (UID: \"ea540b21-cd5f-485f-863c-24bc91beab7e\") " Mar 17 11:26:03 crc kubenswrapper[4742]: I0317 11:26:03.975215 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea540b21-cd5f-485f-863c-24bc91beab7e-kube-api-access-qb8qg" (OuterVolumeSpecName: "kube-api-access-qb8qg") pod "ea540b21-cd5f-485f-863c-24bc91beab7e" (UID: "ea540b21-cd5f-485f-863c-24bc91beab7e"). InnerVolumeSpecName "kube-api-access-qb8qg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.070894 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb8qg\" (UniqueName: \"kubernetes.io/projected/ea540b21-cd5f-485f-863c-24bc91beab7e-kube-api-access-qb8qg\") on node \"crc\" DevicePath \"\"" Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.532737 4742 generic.go:334] "Generic (PLEG): container finished" podID="9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85" containerID="03c6bb2bd8d9cb9631d1e26b9143e2d89f9edd02333203333a2092a506ac4606" exitCode=0 Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.532787 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lv9p" event={"ID":"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85","Type":"ContainerDied","Data":"03c6bb2bd8d9cb9631d1e26b9143e2d89f9edd02333203333a2092a506ac4606"} Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.534425 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562446-67mbt" event={"ID":"ea540b21-cd5f-485f-863c-24bc91beab7e","Type":"ContainerDied","Data":"c366d77f8584cfa5778de4a5fe1ef531f1a2faa50808094946d259af998d5ce8"} Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.534457 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c366d77f8584cfa5778de4a5fe1ef531f1a2faa50808094946d259af998d5ce8" Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.534504 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562446-67mbt" Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.619390 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-7gfv5"] Mar 17 11:26:04 crc kubenswrapper[4742]: E0317 11:26:04.619869 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea540b21-cd5f-485f-863c-24bc91beab7e" containerName="oc" Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.619880 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea540b21-cd5f-485f-863c-24bc91beab7e" containerName="oc" Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.620301 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea540b21-cd5f-485f-863c-24bc91beab7e" containerName="oc" Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.620695 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-7gfv5" Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.624799 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-6rk45" Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.624864 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.625085 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.629611 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-7gfv5"] Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.678340 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw78k\" (UniqueName: \"kubernetes.io/projected/8ec78658-d1d9-4fa9-953c-153e38522338-kube-api-access-lw78k\") pod \"nmstate-operator-796d4cfff4-7gfv5\" (UID: \"8ec78658-d1d9-4fa9-953c-153e38522338\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-7gfv5" Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.779816 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw78k\" (UniqueName: \"kubernetes.io/projected/8ec78658-d1d9-4fa9-953c-153e38522338-kube-api-access-lw78k\") pod \"nmstate-operator-796d4cfff4-7gfv5\" (UID: \"8ec78658-d1d9-4fa9-953c-153e38522338\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-7gfv5" Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.805955 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw78k\" (UniqueName: \"kubernetes.io/projected/8ec78658-d1d9-4fa9-953c-153e38522338-kube-api-access-lw78k\") pod \"nmstate-operator-796d4cfff4-7gfv5\" (UID: \"8ec78658-d1d9-4fa9-953c-153e38522338\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-7gfv5" Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.894666 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562440-nvhsf"] Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.898381 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562440-nvhsf"] Mar 17 11:26:04 crc kubenswrapper[4742]: I0317 11:26:04.937705 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-7gfv5" Mar 17 11:26:05 crc kubenswrapper[4742]: I0317 11:26:05.130511 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-7gfv5"] Mar 17 11:26:05 crc kubenswrapper[4742]: W0317 11:26:05.141107 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ec78658_d1d9_4fa9_953c_153e38522338.slice/crio-4b4c8689eb037d4ffd3ff1f066a9c946b95a6ebfb04fa1059350d2f4eb7d370e WatchSource:0}: Error finding container 4b4c8689eb037d4ffd3ff1f066a9c946b95a6ebfb04fa1059350d2f4eb7d370e: Status 404 returned error can't find the container with id 4b4c8689eb037d4ffd3ff1f066a9c946b95a6ebfb04fa1059350d2f4eb7d370e Mar 17 11:26:05 crc kubenswrapper[4742]: I0317 11:26:05.543703 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-7gfv5" event={"ID":"8ec78658-d1d9-4fa9-953c-153e38522338","Type":"ContainerStarted","Data":"4b4c8689eb037d4ffd3ff1f066a9c946b95a6ebfb04fa1059350d2f4eb7d370e"} Mar 17 11:26:05 crc kubenswrapper[4742]: I0317 11:26:05.695225 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t77nk" Mar 17 11:26:05 crc kubenswrapper[4742]: I0317 11:26:05.696082 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t77nk" Mar 17 11:26:06 crc kubenswrapper[4742]: I0317 11:26:06.553170 4742 generic.go:334] "Generic (PLEG): container finished" podID="9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85" containerID="a6fb7a914d7ea2f84142bb01dad8aefd7ed346e8ee0599fe0a4c4e3d7987847f" exitCode=0 Mar 17 11:26:06 crc kubenswrapper[4742]: I0317 11:26:06.553222 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lv9p" event={"ID":"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85","Type":"ContainerDied","Data":"a6fb7a914d7ea2f84142bb01dad8aefd7ed346e8ee0599fe0a4c4e3d7987847f"} Mar 17 11:26:06 crc kubenswrapper[4742]: I0317 11:26:06.669393 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9cb0185-95a0-4694-8dcd-b75801842648" path="/var/lib/kubelet/pods/b9cb0185-95a0-4694-8dcd-b75801842648/volumes" Mar 17 11:26:06 crc kubenswrapper[4742]: I0317 11:26:06.736622 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t77nk" podUID="cdee2ceb-f58d-469d-9428-44daeba832c7" containerName="registry-server" probeResult="failure" output=< Mar 17 11:26:06 crc kubenswrapper[4742]: timeout: failed to connect service ":50051" within 1s Mar 17 11:26:06 crc kubenswrapper[4742]: > Mar 17 11:26:07 crc kubenswrapper[4742]: I0317 11:26:07.561309 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lv9p" event={"ID":"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85","Type":"ContainerStarted","Data":"cda9947374d5e4a7c36eb1bffa9a6434b28852e2712e6f5cdba8741cc1a37cb8"} Mar 17 11:26:07 crc kubenswrapper[4742]: I0317 11:26:07.583491 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8lv9p" podStartSLOduration=3.128229883 podStartE2EDuration="5.583471603s" podCreationTimestamp="2026-03-17 11:26:02 +0000 UTC" firstStartedPulling="2026-03-17 11:26:04.535374956 +0000 UTC m=+867.661502724" lastFinishedPulling="2026-03-17 11:26:06.990616666 +0000 UTC m=+870.116744444" 
observedRunningTime="2026-03-17 11:26:07.580115981 +0000 UTC m=+870.706243759" watchObservedRunningTime="2026-03-17 11:26:07.583471603 +0000 UTC m=+870.709599361" Mar 17 11:26:08 crc kubenswrapper[4742]: I0317 11:26:08.568333 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-7gfv5" event={"ID":"8ec78658-d1d9-4fa9-953c-153e38522338","Type":"ContainerStarted","Data":"25789add5326ff1c9b7c01f89095ef0b3d2f66e8158fbd527bd4711cc05b5068"} Mar 17 11:26:08 crc kubenswrapper[4742]: I0317 11:26:08.597867 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-7gfv5" podStartSLOduration=2.153955617 podStartE2EDuration="4.597849179s" podCreationTimestamp="2026-03-17 11:26:04 +0000 UTC" firstStartedPulling="2026-03-17 11:26:05.143173848 +0000 UTC m=+868.269301606" lastFinishedPulling="2026-03-17 11:26:07.58706741 +0000 UTC m=+870.713195168" observedRunningTime="2026-03-17 11:26:08.59086977 +0000 UTC m=+871.716997538" watchObservedRunningTime="2026-03-17 11:26:08.597849179 +0000 UTC m=+871.723976957" Mar 17 11:26:13 crc kubenswrapper[4742]: I0317 11:26:13.275278 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8lv9p" Mar 17 11:26:13 crc kubenswrapper[4742]: I0317 11:26:13.275796 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8lv9p" Mar 17 11:26:13 crc kubenswrapper[4742]: I0317 11:26:13.329738 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8lv9p" Mar 17 11:26:13 crc kubenswrapper[4742]: I0317 11:26:13.668897 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8lv9p" Mar 17 11:26:14 crc kubenswrapper[4742]: I0317 11:26:14.327132 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8lv9p"] Mar 17 11:26:15 crc kubenswrapper[4742]: I0317 11:26:15.616189 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8lv9p" podUID="9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85" containerName="registry-server" containerID="cri-o://cda9947374d5e4a7c36eb1bffa9a6434b28852e2712e6f5cdba8741cc1a37cb8" gracePeriod=2 Mar 17 11:26:15 crc kubenswrapper[4742]: I0317 11:26:15.769490 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t77nk" Mar 17 11:26:15 crc kubenswrapper[4742]: I0317 11:26:15.832473 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t77nk" Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.034455 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8lv9p" Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.141092 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-catalog-content\") pod \"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85\" (UID: \"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85\") " Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.141143 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-utilities\") pod \"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85\" (UID: \"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85\") " Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.141221 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km5fn\" (UniqueName: \"kubernetes.io/projected/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-kube-api-access-km5fn\") pod \"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85\" (UID: \"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85\") " Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.142853 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-utilities" (OuterVolumeSpecName: "utilities") pod "9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85" (UID: "9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.150336 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-kube-api-access-km5fn" (OuterVolumeSpecName: "kube-api-access-km5fn") pod "9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85" (UID: "9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85"). InnerVolumeSpecName "kube-api-access-km5fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.180542 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85" (UID: "9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.243158 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km5fn\" (UniqueName: \"kubernetes.io/projected/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-kube-api-access-km5fn\") on node \"crc\" DevicePath \"\"" Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.243211 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.243223 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.632278 4742 generic.go:334] "Generic (PLEG): container finished" podID="9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85" containerID="cda9947374d5e4a7c36eb1bffa9a6434b28852e2712e6f5cdba8741cc1a37cb8" exitCode=0 Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.633414 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8lv9p" Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.633615 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lv9p" event={"ID":"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85","Type":"ContainerDied","Data":"cda9947374d5e4a7c36eb1bffa9a6434b28852e2712e6f5cdba8741cc1a37cb8"} Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.633721 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lv9p" event={"ID":"9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85","Type":"ContainerDied","Data":"8cfc4d41110105754151ad8ccf3d91fda9c354ed68b7597dc35e4be6c8ff3d68"} Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.633764 4742 scope.go:117] "RemoveContainer" containerID="cda9947374d5e4a7c36eb1bffa9a6434b28852e2712e6f5cdba8741cc1a37cb8" Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.680669 4742 scope.go:117] "RemoveContainer" containerID="a6fb7a914d7ea2f84142bb01dad8aefd7ed346e8ee0599fe0a4c4e3d7987847f" Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.683364 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8lv9p"] Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.690356 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8lv9p"] Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.702809 4742 scope.go:117] "RemoveContainer" containerID="03c6bb2bd8d9cb9631d1e26b9143e2d89f9edd02333203333a2092a506ac4606" Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.725697 4742 scope.go:117] "RemoveContainer" containerID="cda9947374d5e4a7c36eb1bffa9a6434b28852e2712e6f5cdba8741cc1a37cb8" Mar 17 11:26:16 crc kubenswrapper[4742]: E0317 11:26:16.726585 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cda9947374d5e4a7c36eb1bffa9a6434b28852e2712e6f5cdba8741cc1a37cb8\": container with ID starting with cda9947374d5e4a7c36eb1bffa9a6434b28852e2712e6f5cdba8741cc1a37cb8 not found: ID does not exist" containerID="cda9947374d5e4a7c36eb1bffa9a6434b28852e2712e6f5cdba8741cc1a37cb8" Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.726683 4742 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cda9947374d5e4a7c36eb1bffa9a6434b28852e2712e6f5cdba8741cc1a37cb8"} err="failed to get container status \"cda9947374d5e4a7c36eb1bffa9a6434b28852e2712e6f5cdba8741cc1a37cb8\": rpc error: code = NotFound desc = could not find container \"cda9947374d5e4a7c36eb1bffa9a6434b28852e2712e6f5cdba8741cc1a37cb8\": container with ID starting with cda9947374d5e4a7c36eb1bffa9a6434b28852e2712e6f5cdba8741cc1a37cb8 not found: ID does not exist" Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.726760 4742 scope.go:117] "RemoveContainer" containerID="a6fb7a914d7ea2f84142bb01dad8aefd7ed346e8ee0599fe0a4c4e3d7987847f" Mar 17 11:26:16 crc kubenswrapper[4742]: E0317 11:26:16.727509 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6fb7a914d7ea2f84142bb01dad8aefd7ed346e8ee0599fe0a4c4e3d7987847f\": container with ID starting with a6fb7a914d7ea2f84142bb01dad8aefd7ed346e8ee0599fe0a4c4e3d7987847f not found: ID does not exist" containerID="a6fb7a914d7ea2f84142bb01dad8aefd7ed346e8ee0599fe0a4c4e3d7987847f" Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.727566 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6fb7a914d7ea2f84142bb01dad8aefd7ed346e8ee0599fe0a4c4e3d7987847f"} err="failed to get container status \"a6fb7a914d7ea2f84142bb01dad8aefd7ed346e8ee0599fe0a4c4e3d7987847f\": rpc error: code = NotFound desc = could not find container \"a6fb7a914d7ea2f84142bb01dad8aefd7ed346e8ee0599fe0a4c4e3d7987847f\": container with ID starting with a6fb7a914d7ea2f84142bb01dad8aefd7ed346e8ee0599fe0a4c4e3d7987847f not found: ID does not exist" Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.727606 4742 scope.go:117] "RemoveContainer" containerID="03c6bb2bd8d9cb9631d1e26b9143e2d89f9edd02333203333a2092a506ac4606" Mar 17 11:26:16 crc kubenswrapper[4742]: E0317 11:26:16.728467 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03c6bb2bd8d9cb9631d1e26b9143e2d89f9edd02333203333a2092a506ac4606\": container with ID starting with 03c6bb2bd8d9cb9631d1e26b9143e2d89f9edd02333203333a2092a506ac4606 not found: ID does not exist" containerID="03c6bb2bd8d9cb9631d1e26b9143e2d89f9edd02333203333a2092a506ac4606" Mar 17 11:26:16 crc kubenswrapper[4742]: I0317 11:26:16.728511 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03c6bb2bd8d9cb9631d1e26b9143e2d89f9edd02333203333a2092a506ac4606"} err="failed to get container status \"03c6bb2bd8d9cb9631d1e26b9143e2d89f9edd02333203333a2092a506ac4606\": rpc error: code = NotFound desc = could not find container \"03c6bb2bd8d9cb9631d1e26b9143e2d89f9edd02333203333a2092a506ac4606\": container with ID starting with 03c6bb2bd8d9cb9631d1e26b9143e2d89f9edd02333203333a2092a506ac4606 not found: ID does not exist" Mar 17 11:26:17 crc kubenswrapper[4742]: I0317 11:26:17.328536 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t77nk"] Mar 17 11:26:17 crc kubenswrapper[4742]: I0317 11:26:17.644177 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t77nk" podUID="cdee2ceb-f58d-469d-9428-44daeba832c7" containerName="registry-server" containerID="cri-o://f16ff34a9173e378cce21e7d92d4501a273acd97fb3178634b64e404ea883aae" gracePeriod=2 Mar 17 11:26:18 
crc kubenswrapper[4742]: I0317 11:26:18.087502 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t77nk" Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.167139 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjb6n\" (UniqueName: \"kubernetes.io/projected/cdee2ceb-f58d-469d-9428-44daeba832c7-kube-api-access-wjb6n\") pod \"cdee2ceb-f58d-469d-9428-44daeba832c7\" (UID: \"cdee2ceb-f58d-469d-9428-44daeba832c7\") " Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.167246 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdee2ceb-f58d-469d-9428-44daeba832c7-catalog-content\") pod \"cdee2ceb-f58d-469d-9428-44daeba832c7\" (UID: \"cdee2ceb-f58d-469d-9428-44daeba832c7\") " Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.167357 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdee2ceb-f58d-469d-9428-44daeba832c7-utilities\") pod \"cdee2ceb-f58d-469d-9428-44daeba832c7\" (UID: \"cdee2ceb-f58d-469d-9428-44daeba832c7\") " Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.169371 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdee2ceb-f58d-469d-9428-44daeba832c7-utilities" (OuterVolumeSpecName: "utilities") pod "cdee2ceb-f58d-469d-9428-44daeba832c7" (UID: "cdee2ceb-f58d-469d-9428-44daeba832c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.172872 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdee2ceb-f58d-469d-9428-44daeba832c7-kube-api-access-wjb6n" (OuterVolumeSpecName: "kube-api-access-wjb6n") pod "cdee2ceb-f58d-469d-9428-44daeba832c7" (UID: "cdee2ceb-f58d-469d-9428-44daeba832c7"). InnerVolumeSpecName "kube-api-access-wjb6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.269583 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjb6n\" (UniqueName: \"kubernetes.io/projected/cdee2ceb-f58d-469d-9428-44daeba832c7-kube-api-access-wjb6n\") on node \"crc\" DevicePath \"\"" Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.269620 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdee2ceb-f58d-469d-9428-44daeba832c7-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.336112 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdee2ceb-f58d-469d-9428-44daeba832c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cdee2ceb-f58d-469d-9428-44daeba832c7" (UID: "cdee2ceb-f58d-469d-9428-44daeba832c7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.371525 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdee2ceb-f58d-469d-9428-44daeba832c7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.655453 4742 generic.go:334] "Generic (PLEG): container finished" podID="cdee2ceb-f58d-469d-9428-44daeba832c7" containerID="f16ff34a9173e378cce21e7d92d4501a273acd97fb3178634b64e404ea883aae" exitCode=0 Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.655532 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t77nk" Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.655528 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t77nk" event={"ID":"cdee2ceb-f58d-469d-9428-44daeba832c7","Type":"ContainerDied","Data":"f16ff34a9173e378cce21e7d92d4501a273acd97fb3178634b64e404ea883aae"} Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.656035 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t77nk" event={"ID":"cdee2ceb-f58d-469d-9428-44daeba832c7","Type":"ContainerDied","Data":"81eba4c0c48202b394a76f0de609b455386e9a0d004a7a707d4450277154fe5f"} Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.656057 4742 scope.go:117] "RemoveContainer" containerID="f16ff34a9173e378cce21e7d92d4501a273acd97fb3178634b64e404ea883aae" Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.683648 4742 scope.go:117] "RemoveContainer" containerID="f9fe58a89d181eba3feaeff6d95d0b992c00990ce020d4fea2c8f895fac1c45b" Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.683774 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85" path="/var/lib/kubelet/pods/9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85/volumes" Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.699389 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t77nk"] Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.706853 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t77nk"] Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.722966 4742 scope.go:117] "RemoveContainer" containerID="0c3e2fb8b85bfe472cde5c59d64e169ac654993be2c3e16945f463110e59b3a6" Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.749236 4742 scope.go:117] "RemoveContainer" containerID="f16ff34a9173e378cce21e7d92d4501a273acd97fb3178634b64e404ea883aae" Mar 17 11:26:18 crc kubenswrapper[4742]: E0317 11:26:18.749821 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f16ff34a9173e378cce21e7d92d4501a273acd97fb3178634b64e404ea883aae\": container with ID starting with f16ff34a9173e378cce21e7d92d4501a273acd97fb3178634b64e404ea883aae not found: ID does not exist" containerID="f16ff34a9173e378cce21e7d92d4501a273acd97fb3178634b64e404ea883aae" Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.749877 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f16ff34a9173e378cce21e7d92d4501a273acd97fb3178634b64e404ea883aae"} err="failed to get container status \"f16ff34a9173e378cce21e7d92d4501a273acd97fb3178634b64e404ea883aae\": rpc error: code = NotFound desc 
= could not find container \"f16ff34a9173e378cce21e7d92d4501a273acd97fb3178634b64e404ea883aae\": container with ID starting with f16ff34a9173e378cce21e7d92d4501a273acd97fb3178634b64e404ea883aae not found: ID does not exist" Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.749947 4742 scope.go:117] "RemoveContainer" containerID="f9fe58a89d181eba3feaeff6d95d0b992c00990ce020d4fea2c8f895fac1c45b" Mar 17 11:26:18 crc kubenswrapper[4742]: E0317 11:26:18.750408 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9fe58a89d181eba3feaeff6d95d0b992c00990ce020d4fea2c8f895fac1c45b\": container with ID starting with f9fe58a89d181eba3feaeff6d95d0b992c00990ce020d4fea2c8f895fac1c45b not found: ID does not exist" containerID="f9fe58a89d181eba3feaeff6d95d0b992c00990ce020d4fea2c8f895fac1c45b" Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.750438 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9fe58a89d181eba3feaeff6d95d0b992c00990ce020d4fea2c8f895fac1c45b"} err="failed to get container status \"f9fe58a89d181eba3feaeff6d95d0b992c00990ce020d4fea2c8f895fac1c45b\": rpc error: code = NotFound desc = could not find container \"f9fe58a89d181eba3feaeff6d95d0b992c00990ce020d4fea2c8f895fac1c45b\": container with ID starting with f9fe58a89d181eba3feaeff6d95d0b992c00990ce020d4fea2c8f895fac1c45b not found: ID does not exist" Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.750457 4742 scope.go:117] "RemoveContainer" containerID="0c3e2fb8b85bfe472cde5c59d64e169ac654993be2c3e16945f463110e59b3a6" Mar 17 11:26:18 crc kubenswrapper[4742]: E0317 11:26:18.751094 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c3e2fb8b85bfe472cde5c59d64e169ac654993be2c3e16945f463110e59b3a6\": container with ID starting with 0c3e2fb8b85bfe472cde5c59d64e169ac654993be2c3e16945f463110e59b3a6 not found: ID does not exist" containerID="0c3e2fb8b85bfe472cde5c59d64e169ac654993be2c3e16945f463110e59b3a6" Mar 17 11:26:18 crc kubenswrapper[4742]: I0317 11:26:18.751307 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c3e2fb8b85bfe472cde5c59d64e169ac654993be2c3e16945f463110e59b3a6"} err="failed to get container status \"0c3e2fb8b85bfe472cde5c59d64e169ac654993be2c3e16945f463110e59b3a6\": rpc error: code = NotFound desc = could not find container \"0c3e2fb8b85bfe472cde5c59d64e169ac654993be2c3e16945f463110e59b3a6\": container with ID starting with 0c3e2fb8b85bfe472cde5c59d64e169ac654993be2c3e16945f463110e59b3a6 not found: ID does not exist" Mar 17 11:26:20 crc kubenswrapper[4742]: I0317 11:26:20.676239 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdee2ceb-f58d-469d-9428-44daeba832c7" path="/var/lib/kubelet/pods/cdee2ceb-f58d-469d-9428-44daeba832c7/volumes" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.190133 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-ttrvv"] Mar 17 11:26:35 crc kubenswrapper[4742]: E0317 11:26:35.191073 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85" containerName="extract-content" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.191093 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85" containerName="extract-content" Mar 17 11:26:35 crc 
kubenswrapper[4742]: E0317 11:26:35.191111 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85" containerName="extract-utilities" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.191121 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85" containerName="extract-utilities" Mar 17 11:26:35 crc kubenswrapper[4742]: E0317 11:26:35.191132 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdee2ceb-f58d-469d-9428-44daeba832c7" containerName="extract-utilities" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.191142 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdee2ceb-f58d-469d-9428-44daeba832c7" containerName="extract-utilities" Mar 17 11:26:35 crc kubenswrapper[4742]: E0317 11:26:35.191158 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdee2ceb-f58d-469d-9428-44daeba832c7" containerName="extract-content" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.191168 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdee2ceb-f58d-469d-9428-44daeba832c7" containerName="extract-content" Mar 17 11:26:35 crc kubenswrapper[4742]: E0317 11:26:35.191182 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85" containerName="registry-server" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.191217 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85" containerName="registry-server" Mar 17 11:26:35 crc kubenswrapper[4742]: E0317 11:26:35.191244 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdee2ceb-f58d-469d-9428-44daeba832c7" containerName="registry-server" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.191255 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdee2ceb-f58d-469d-9428-44daeba832c7" containerName="registry-server" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.191394 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7a4b10-cdd4-4b30-95e2-6cc92bd81b85" containerName="registry-server" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.191412 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdee2ceb-f58d-469d-9428-44daeba832c7" containerName="registry-server" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.192430 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ttrvv" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.194340 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-jbk25" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.212695 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-cn8bb"] Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.213563 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-cn8bb" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.217578 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.220357 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-ttrvv"] Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.249493 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-cn8bb"] Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.262536 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-b78m6"] Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.263378 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-b78m6" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.332107 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-7wxg4"] Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.332833 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7wxg4" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.335710 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.336149 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.336224 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-fcx8p" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.356023 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6b4v\" (UniqueName: \"kubernetes.io/projected/36b76368-76e0-42c0-944f-c799a074ff7f-kube-api-access-q6b4v\") pod \"nmstate-metrics-9b8c8685d-ttrvv\" (UID: \"36b76368-76e0-42c0-944f-c799a074ff7f\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ttrvv" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.356073 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b2286c3d-e7d9-4ab5-827b-e6f7b9453a5b-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-cn8bb\" (UID: \"b2286c3d-e7d9-4ab5-827b-e6f7b9453a5b\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-cn8bb" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.356103 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6fpp\" (UniqueName: \"kubernetes.io/projected/b2286c3d-e7d9-4ab5-827b-e6f7b9453a5b-kube-api-access-b6fpp\") pod \"nmstate-webhook-5f558f5558-cn8bb\" (UID: \"b2286c3d-e7d9-4ab5-827b-e6f7b9453a5b\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-cn8bb" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.362622 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-7wxg4"] Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.456996 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/a9d77ceb-2194-4bf6-809d-30ebc45c4dba-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-7wxg4\" (UID: \"a9d77ceb-2194-4bf6-809d-30ebc45c4dba\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7wxg4" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.457032 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc4w9\" (UniqueName: \"kubernetes.io/projected/15a73401-5a6e-4a32-99ba-4efe8182c160-kube-api-access-kc4w9\") pod \"nmstate-handler-b78m6\" (UID: \"15a73401-5a6e-4a32-99ba-4efe8182c160\") " pod="openshift-nmstate/nmstate-handler-b78m6" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.457057 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/15a73401-5a6e-4a32-99ba-4efe8182c160-ovs-socket\") pod \"nmstate-handler-b78m6\" (UID: \"15a73401-5a6e-4a32-99ba-4efe8182c160\") " pod="openshift-nmstate/nmstate-handler-b78m6" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.457092 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/15a73401-5a6e-4a32-99ba-4efe8182c160-dbus-socket\") pod \"nmstate-handler-b78m6\" (UID: \"15a73401-5a6e-4a32-99ba-4efe8182c160\") " pod="openshift-nmstate/nmstate-handler-b78m6" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.457107 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhrfz\" (UniqueName: \"kubernetes.io/projected/a9d77ceb-2194-4bf6-809d-30ebc45c4dba-kube-api-access-jhrfz\") pod \"nmstate-console-plugin-86f58fcf4-7wxg4\" (UID: \"a9d77ceb-2194-4bf6-809d-30ebc45c4dba\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7wxg4" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.457136 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/15a73401-5a6e-4a32-99ba-4efe8182c160-nmstate-lock\") pod \"nmstate-handler-b78m6\" (UID: \"15a73401-5a6e-4a32-99ba-4efe8182c160\") " pod="openshift-nmstate/nmstate-handler-b78m6" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.457267 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9d77ceb-2194-4bf6-809d-30ebc45c4dba-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-7wxg4\" (UID: \"a9d77ceb-2194-4bf6-809d-30ebc45c4dba\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7wxg4" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.457358 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6b4v\" (UniqueName: \"kubernetes.io/projected/36b76368-76e0-42c0-944f-c799a074ff7f-kube-api-access-q6b4v\") pod \"nmstate-metrics-9b8c8685d-ttrvv\" (UID: \"36b76368-76e0-42c0-944f-c799a074ff7f\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ttrvv" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.457397 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b2286c3d-e7d9-4ab5-827b-e6f7b9453a5b-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-cn8bb\" (UID: \"b2286c3d-e7d9-4ab5-827b-e6f7b9453a5b\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-cn8bb" Mar 
17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.457639 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6fpp\" (UniqueName: \"kubernetes.io/projected/b2286c3d-e7d9-4ab5-827b-e6f7b9453a5b-kube-api-access-b6fpp\") pod \"nmstate-webhook-5f558f5558-cn8bb\" (UID: \"b2286c3d-e7d9-4ab5-827b-e6f7b9453a5b\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-cn8bb" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.467616 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/b2286c3d-e7d9-4ab5-827b-e6f7b9453a5b-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-cn8bb\" (UID: \"b2286c3d-e7d9-4ab5-827b-e6f7b9453a5b\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-cn8bb" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.474856 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6fpp\" (UniqueName: \"kubernetes.io/projected/b2286c3d-e7d9-4ab5-827b-e6f7b9453a5b-kube-api-access-b6fpp\") pod \"nmstate-webhook-5f558f5558-cn8bb\" (UID: \"b2286c3d-e7d9-4ab5-827b-e6f7b9453a5b\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-cn8bb" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.480622 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6b4v\" (UniqueName: \"kubernetes.io/projected/36b76368-76e0-42c0-944f-c799a074ff7f-kube-api-access-q6b4v\") pod \"nmstate-metrics-9b8c8685d-ttrvv\" (UID: \"36b76368-76e0-42c0-944f-c799a074ff7f\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ttrvv" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.508709 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6ffbdfddfb-jgw5h"] Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.509379 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.525109 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ffbdfddfb-jgw5h"] Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.526295 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ttrvv" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.548559 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-cn8bb" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.564850 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9d77ceb-2194-4bf6-809d-30ebc45c4dba-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-7wxg4\" (UID: \"a9d77ceb-2194-4bf6-809d-30ebc45c4dba\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7wxg4" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.564935 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/116a23d5-6ea4-4b98-a376-b9291413ba68-oauth-serving-cert\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.564975 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/116a23d5-6ea4-4b98-a376-b9291413ba68-console-serving-cert\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.565016 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a9d77ceb-2194-4bf6-809d-30ebc45c4dba-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-7wxg4\" (UID: \"a9d77ceb-2194-4bf6-809d-30ebc45c4dba\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7wxg4" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.565040 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc4w9\" (UniqueName: \"kubernetes.io/projected/15a73401-5a6e-4a32-99ba-4efe8182c160-kube-api-access-kc4w9\") pod \"nmstate-handler-b78m6\" (UID: \"15a73401-5a6e-4a32-99ba-4efe8182c160\") " pod="openshift-nmstate/nmstate-handler-b78m6" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.565071 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/116a23d5-6ea4-4b98-a376-b9291413ba68-service-ca\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.565097 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/116a23d5-6ea4-4b98-a376-b9291413ba68-console-config\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.565125 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/15a73401-5a6e-4a32-99ba-4efe8182c160-ovs-socket\") pod \"nmstate-handler-b78m6\" (UID: \"15a73401-5a6e-4a32-99ba-4efe8182c160\") " pod="openshift-nmstate/nmstate-handler-b78m6" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.565161 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/116a23d5-6ea4-4b98-a376-b9291413ba68-console-oauth-config\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.565191 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/15a73401-5a6e-4a32-99ba-4efe8182c160-dbus-socket\") pod \"nmstate-handler-b78m6\" (UID: \"15a73401-5a6e-4a32-99ba-4efe8182c160\") " pod="openshift-nmstate/nmstate-handler-b78m6" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.565214 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhrfz\" (UniqueName: \"kubernetes.io/projected/a9d77ceb-2194-4bf6-809d-30ebc45c4dba-kube-api-access-jhrfz\") pod \"nmstate-console-plugin-86f58fcf4-7wxg4\" (UID: \"a9d77ceb-2194-4bf6-809d-30ebc45c4dba\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7wxg4" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.565251 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/15a73401-5a6e-4a32-99ba-4efe8182c160-nmstate-lock\") pod \"nmstate-handler-b78m6\" (UID: \"15a73401-5a6e-4a32-99ba-4efe8182c160\") " pod="openshift-nmstate/nmstate-handler-b78m6" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.565275 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/116a23d5-6ea4-4b98-a376-b9291413ba68-trusted-ca-bundle\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.565300 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swmlt\" (UniqueName: \"kubernetes.io/projected/116a23d5-6ea4-4b98-a376-b9291413ba68-kube-api-access-swmlt\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.565396 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/15a73401-5a6e-4a32-99ba-4efe8182c160-ovs-socket\") pod \"nmstate-handler-b78m6\" (UID: \"15a73401-5a6e-4a32-99ba-4efe8182c160\") " pod="openshift-nmstate/nmstate-handler-b78m6" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.565708 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/15a73401-5a6e-4a32-99ba-4efe8182c160-dbus-socket\") pod \"nmstate-handler-b78m6\" (UID: \"15a73401-5a6e-4a32-99ba-4efe8182c160\") " pod="openshift-nmstate/nmstate-handler-b78m6" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.565843 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/15a73401-5a6e-4a32-99ba-4efe8182c160-nmstate-lock\") pod \"nmstate-handler-b78m6\" (UID: \"15a73401-5a6e-4a32-99ba-4efe8182c160\") " pod="openshift-nmstate/nmstate-handler-b78m6" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.565856 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/a9d77ceb-2194-4bf6-809d-30ebc45c4dba-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-7wxg4\" (UID: \"a9d77ceb-2194-4bf6-809d-30ebc45c4dba\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7wxg4" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.573494 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9d77ceb-2194-4bf6-809d-30ebc45c4dba-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-7wxg4\" (UID: \"a9d77ceb-2194-4bf6-809d-30ebc45c4dba\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7wxg4" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.580957 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc4w9\" (UniqueName: \"kubernetes.io/projected/15a73401-5a6e-4a32-99ba-4efe8182c160-kube-api-access-kc4w9\") pod \"nmstate-handler-b78m6\" (UID: \"15a73401-5a6e-4a32-99ba-4efe8182c160\") " pod="openshift-nmstate/nmstate-handler-b78m6" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.583675 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhrfz\" (UniqueName: \"kubernetes.io/projected/a9d77ceb-2194-4bf6-809d-30ebc45c4dba-kube-api-access-jhrfz\") pod \"nmstate-console-plugin-86f58fcf4-7wxg4\" (UID: \"a9d77ceb-2194-4bf6-809d-30ebc45c4dba\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7wxg4" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.646659 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7wxg4" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.667129 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/116a23d5-6ea4-4b98-a376-b9291413ba68-service-ca\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.667194 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/116a23d5-6ea4-4b98-a376-b9291413ba68-console-config\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.667228 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/116a23d5-6ea4-4b98-a376-b9291413ba68-console-oauth-config\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.667264 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/116a23d5-6ea4-4b98-a376-b9291413ba68-trusted-ca-bundle\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.667279 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swmlt\" (UniqueName: \"kubernetes.io/projected/116a23d5-6ea4-4b98-a376-b9291413ba68-kube-api-access-swmlt\") pod \"console-6ffbdfddfb-jgw5h\" (UID: 
\"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.667304 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/116a23d5-6ea4-4b98-a376-b9291413ba68-oauth-serving-cert\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.667324 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/116a23d5-6ea4-4b98-a376-b9291413ba68-console-serving-cert\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.668482 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/116a23d5-6ea4-4b98-a376-b9291413ba68-console-config\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.668834 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/116a23d5-6ea4-4b98-a376-b9291413ba68-service-ca\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.669799 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/116a23d5-6ea4-4b98-a376-b9291413ba68-trusted-ca-bundle\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.669818 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/116a23d5-6ea4-4b98-a376-b9291413ba68-oauth-serving-cert\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.672148 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/116a23d5-6ea4-4b98-a376-b9291413ba68-console-serving-cert\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.675068 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/116a23d5-6ea4-4b98-a376-b9291413ba68-console-oauth-config\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.698602 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swmlt\" (UniqueName: \"kubernetes.io/projected/116a23d5-6ea4-4b98-a376-b9291413ba68-kube-api-access-swmlt\") pod \"console-6ffbdfddfb-jgw5h\" (UID: \"116a23d5-6ea4-4b98-a376-b9291413ba68\") " 
pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.720726 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-ttrvv"] Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.767809 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-cn8bb"] Mar 17 11:26:35 crc kubenswrapper[4742]: W0317 11:26:35.777049 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2286c3d_e7d9_4ab5_827b_e6f7b9453a5b.slice/crio-56b4f5b80d2640e0d8d8ff5ff245949be19271df1d6bab78be4d1aab33c9d29b WatchSource:0}: Error finding container 56b4f5b80d2640e0d8d8ff5ff245949be19271df1d6bab78be4d1aab33c9d29b: Status 404 returned error can't find the container with id 56b4f5b80d2640e0d8d8ff5ff245949be19271df1d6bab78be4d1aab33c9d29b Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.792529 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-cn8bb" event={"ID":"b2286c3d-e7d9-4ab5-827b-e6f7b9453a5b","Type":"ContainerStarted","Data":"56b4f5b80d2640e0d8d8ff5ff245949be19271df1d6bab78be4d1aab33c9d29b"} Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.793801 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ttrvv" event={"ID":"36b76368-76e0-42c0-944f-c799a074ff7f","Type":"ContainerStarted","Data":"e53605d5e6c2e96209936a37e947eb89f5fd0eec17ac4df3ac9d5d7fd54babd9"} Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.830133 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.831559 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-7wxg4"] Mar 17 11:26:35 crc kubenswrapper[4742]: I0317 11:26:35.880535 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-b78m6" Mar 17 11:26:35 crc kubenswrapper[4742]: W0317 11:26:35.918663 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15a73401_5a6e_4a32_99ba_4efe8182c160.slice/crio-f18b54b166855311d06e02b506123619272040852f7edb90a12c9f508e42e707 WatchSource:0}: Error finding container f18b54b166855311d06e02b506123619272040852f7edb90a12c9f508e42e707: Status 404 returned error can't find the container with id f18b54b166855311d06e02b506123619272040852f7edb90a12c9f508e42e707 Mar 17 11:26:36 crc kubenswrapper[4742]: I0317 11:26:36.025387 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ffbdfddfb-jgw5h"] Mar 17 11:26:36 crc kubenswrapper[4742]: W0317 11:26:36.030879 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod116a23d5_6ea4_4b98_a376_b9291413ba68.slice/crio-a0f61c6443c31ca848fb605e8a02dd30371e284f31c05ef14e74621285b6e50f WatchSource:0}: Error finding container a0f61c6443c31ca848fb605e8a02dd30371e284f31c05ef14e74621285b6e50f: Status 404 returned error can't find the container with id a0f61c6443c31ca848fb605e8a02dd30371e284f31c05ef14e74621285b6e50f Mar 17 11:26:36 crc kubenswrapper[4742]: I0317 11:26:36.802484 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-b78m6" event={"ID":"15a73401-5a6e-4a32-99ba-4efe8182c160","Type":"ContainerStarted","Data":"f18b54b166855311d06e02b506123619272040852f7edb90a12c9f508e42e707"} Mar 17 11:26:36 crc kubenswrapper[4742]: I0317 11:26:36.807712 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ffbdfddfb-jgw5h" event={"ID":"116a23d5-6ea4-4b98-a376-b9291413ba68","Type":"ContainerStarted","Data":"b9960c2193a59bc52c53ec83cad13818afec5c2fb702d51e65aa917031e17b3a"} Mar 17 11:26:36 crc kubenswrapper[4742]: I0317 11:26:36.807785 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ffbdfddfb-jgw5h" event={"ID":"116a23d5-6ea4-4b98-a376-b9291413ba68","Type":"ContainerStarted","Data":"a0f61c6443c31ca848fb605e8a02dd30371e284f31c05ef14e74621285b6e50f"} Mar 17 11:26:36 crc kubenswrapper[4742]: I0317 11:26:36.812721 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7wxg4" event={"ID":"a9d77ceb-2194-4bf6-809d-30ebc45c4dba","Type":"ContainerStarted","Data":"1de5bf0c8c570c5ab175de952e900a5ce196b914c67c7bf9824ea0d553ca74d2"} Mar 17 11:26:36 crc kubenswrapper[4742]: I0317 11:26:36.830056 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6ffbdfddfb-jgw5h" podStartSLOduration=1.8300304889999999 podStartE2EDuration="1.830030489s" podCreationTimestamp="2026-03-17 11:26:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:26:36.828854517 +0000 UTC m=+899.954982315" watchObservedRunningTime="2026-03-17 11:26:36.830030489 +0000 UTC m=+899.956158287" Mar 17 11:26:38 crc kubenswrapper[4742]: I0317 11:26:38.824843 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7wxg4" event={"ID":"a9d77ceb-2194-4bf6-809d-30ebc45c4dba","Type":"ContainerStarted","Data":"1c7fadcc56c8a2e96e0b0fbaa21ac0ff92aa012042deb249bfc5ff60a64d021c"} Mar 17 11:26:38 crc 
kubenswrapper[4742]: I0317 11:26:38.827170 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-b78m6" event={"ID":"15a73401-5a6e-4a32-99ba-4efe8182c160","Type":"ContainerStarted","Data":"5d2d166752a9c9594732706f73debd147114a888ba252883611e9eeb4f1a16d4"} Mar 17 11:26:38 crc kubenswrapper[4742]: I0317 11:26:38.827601 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-b78m6" Mar 17 11:26:38 crc kubenswrapper[4742]: I0317 11:26:38.829307 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-cn8bb" event={"ID":"b2286c3d-e7d9-4ab5-827b-e6f7b9453a5b","Type":"ContainerStarted","Data":"a75bf1467ae0125808a3d3feba0b5ae85612b821e15811dcb1b66c0b56b84829"} Mar 17 11:26:38 crc kubenswrapper[4742]: I0317 11:26:38.829365 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-cn8bb" Mar 17 11:26:38 crc kubenswrapper[4742]: I0317 11:26:38.831027 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ttrvv" event={"ID":"36b76368-76e0-42c0-944f-c799a074ff7f","Type":"ContainerStarted","Data":"764a439bda3cc82c89955be6e918592ecf3b09811c1a2ef6a569304de92e0e1b"} Mar 17 11:26:38 crc kubenswrapper[4742]: I0317 11:26:38.848319 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-7wxg4" podStartSLOduration=1.321175504 podStartE2EDuration="3.848292397s" podCreationTimestamp="2026-03-17 11:26:35 +0000 UTC" firstStartedPulling="2026-03-17 11:26:35.844980048 +0000 UTC m=+898.971107806" lastFinishedPulling="2026-03-17 11:26:38.372096931 +0000 UTC m=+901.498224699" observedRunningTime="2026-03-17 11:26:38.838323177 +0000 UTC m=+901.964450965" watchObservedRunningTime="2026-03-17 11:26:38.848292397 +0000 UTC m=+901.974420195" Mar 17 11:26:38 crc kubenswrapper[4742]: I0317 11:26:38.901423 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-b78m6" podStartSLOduration=1.443570919 podStartE2EDuration="3.9013991s" podCreationTimestamp="2026-03-17 11:26:35 +0000 UTC" firstStartedPulling="2026-03-17 11:26:35.921734873 +0000 UTC m=+899.047862641" lastFinishedPulling="2026-03-17 11:26:38.379563054 +0000 UTC m=+901.505690822" observedRunningTime="2026-03-17 11:26:38.895988003 +0000 UTC m=+902.022115801" watchObservedRunningTime="2026-03-17 11:26:38.9013991 +0000 UTC m=+902.027526868" Mar 17 11:26:38 crc kubenswrapper[4742]: I0317 11:26:38.906386 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-cn8bb" podStartSLOduration=1.3095256 podStartE2EDuration="3.906371206s" podCreationTimestamp="2026-03-17 11:26:35 +0000 UTC" firstStartedPulling="2026-03-17 11:26:35.779340856 +0000 UTC m=+898.905468614" lastFinishedPulling="2026-03-17 11:26:38.376186462 +0000 UTC m=+901.502314220" observedRunningTime="2026-03-17 11:26:38.870614294 +0000 UTC m=+901.996742072" watchObservedRunningTime="2026-03-17 11:26:38.906371206 +0000 UTC m=+902.032498974" Mar 17 11:26:38 crc kubenswrapper[4742]: I0317 11:26:38.931979 4742 container_manager_linux.go:630] "Failed to ensure state" containerName="/system.slice" err="failed to move PID 30351 into the system container \"/system.slice\": " Mar 17 11:26:39 crc kubenswrapper[4742]: I0317 11:26:39.992827 4742 scope.go:117] "RemoveContainer" 
containerID="ed15874775926665a5f3c5e51e46899cef9e24dcdb55aa758011f2ed5e03a40f" Mar 17 11:26:42 crc kubenswrapper[4742]: I0317 11:26:42.869402 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ttrvv" event={"ID":"36b76368-76e0-42c0-944f-c799a074ff7f","Type":"ContainerStarted","Data":"298f9ab4aa59b742f514fa1f7ec8ade35b1f75d3de4d89d1dd0b2c9ceab334d1"} Mar 17 11:26:45 crc kubenswrapper[4742]: I0317 11:26:45.831040 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:45 crc kubenswrapper[4742]: I0317 11:26:45.831357 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:45 crc kubenswrapper[4742]: I0317 11:26:45.838774 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:45 crc kubenswrapper[4742]: I0317 11:26:45.867358 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-ttrvv" podStartSLOduration=4.027761525 podStartE2EDuration="10.867333341s" podCreationTimestamp="2026-03-17 11:26:35 +0000 UTC" firstStartedPulling="2026-03-17 11:26:35.733194932 +0000 UTC m=+898.859322690" lastFinishedPulling="2026-03-17 11:26:42.572766708 +0000 UTC m=+905.698894506" observedRunningTime="2026-03-17 11:26:42.906743632 +0000 UTC m=+906.032871450" watchObservedRunningTime="2026-03-17 11:26:45.867333341 +0000 UTC m=+908.993461129" Mar 17 11:26:45 crc kubenswrapper[4742]: I0317 11:26:45.900537 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6ffbdfddfb-jgw5h" Mar 17 11:26:45 crc kubenswrapper[4742]: I0317 11:26:45.928793 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-b78m6" Mar 17 11:26:45 crc kubenswrapper[4742]: I0317 11:26:45.987127 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lfdfp"] Mar 17 11:26:55 crc kubenswrapper[4742]: I0317 11:26:55.559463 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-cn8bb" Mar 17 11:27:09 crc kubenswrapper[4742]: I0317 11:27:09.315336 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp"] Mar 17 11:27:09 crc kubenswrapper[4742]: I0317 11:27:09.317996 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" Mar 17 11:27:09 crc kubenswrapper[4742]: I0317 11:27:09.321111 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 17 11:27:09 crc kubenswrapper[4742]: I0317 11:27:09.326875 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp"] Mar 17 11:27:09 crc kubenswrapper[4742]: I0317 11:27:09.437526 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np9pk\" (UniqueName: \"kubernetes.io/projected/8011261b-573f-4e09-894b-0643fba90f8d-kube-api-access-np9pk\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp\" (UID: \"8011261b-573f-4e09-894b-0643fba90f8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" Mar 17 11:27:09 crc kubenswrapper[4742]: I0317 11:27:09.437605 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8011261b-573f-4e09-894b-0643fba90f8d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp\" (UID: \"8011261b-573f-4e09-894b-0643fba90f8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" Mar 17 11:27:09 crc kubenswrapper[4742]: I0317 11:27:09.437637 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8011261b-573f-4e09-894b-0643fba90f8d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp\" (UID: \"8011261b-573f-4e09-894b-0643fba90f8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" Mar 17 11:27:09 crc kubenswrapper[4742]: I0317 11:27:09.538106 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8011261b-573f-4e09-894b-0643fba90f8d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp\" (UID: \"8011261b-573f-4e09-894b-0643fba90f8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" Mar 17 11:27:09 crc kubenswrapper[4742]: I0317 11:27:09.538166 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8011261b-573f-4e09-894b-0643fba90f8d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp\" (UID: \"8011261b-573f-4e09-894b-0643fba90f8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" Mar 17 11:27:09 crc kubenswrapper[4742]: I0317 11:27:09.538267 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np9pk\" (UniqueName: \"kubernetes.io/projected/8011261b-573f-4e09-894b-0643fba90f8d-kube-api-access-np9pk\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp\" (UID: \"8011261b-573f-4e09-894b-0643fba90f8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" Mar 17 11:27:09 crc kubenswrapper[4742]: I0317 11:27:09.539281 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/8011261b-573f-4e09-894b-0643fba90f8d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp\" (UID: \"8011261b-573f-4e09-894b-0643fba90f8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" Mar 17 11:27:09 crc kubenswrapper[4742]: I0317 11:27:09.539299 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8011261b-573f-4e09-894b-0643fba90f8d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp\" (UID: \"8011261b-573f-4e09-894b-0643fba90f8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" Mar 17 11:27:09 crc kubenswrapper[4742]: I0317 11:27:09.577103 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np9pk\" (UniqueName: \"kubernetes.io/projected/8011261b-573f-4e09-894b-0643fba90f8d-kube-api-access-np9pk\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp\" (UID: \"8011261b-573f-4e09-894b-0643fba90f8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" Mar 17 11:27:09 crc kubenswrapper[4742]: I0317 11:27:09.639481 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" Mar 17 11:27:09 crc kubenswrapper[4742]: I0317 11:27:09.885021 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp"] Mar 17 11:27:09 crc kubenswrapper[4742]: W0317 11:27:09.892979 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8011261b_573f_4e09_894b_0643fba90f8d.slice/crio-31f721b6c22d36843ef7ccc7b76a7b4fbc84d92a708bb3f3d2880f977ba62caf WatchSource:0}: Error finding container 31f721b6c22d36843ef7ccc7b76a7b4fbc84d92a708bb3f3d2880f977ba62caf: Status 404 returned error can't find the container with id 31f721b6c22d36843ef7ccc7b76a7b4fbc84d92a708bb3f3d2880f977ba62caf Mar 17 11:27:10 crc kubenswrapper[4742]: I0317 11:27:10.054668 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" event={"ID":"8011261b-573f-4e09-894b-0643fba90f8d","Type":"ContainerStarted","Data":"bb23b038052c66f0062827d35e28cae0dbee2e56797bccf8c633c51aacd5bda1"} Mar 17 11:27:10 crc kubenswrapper[4742]: I0317 11:27:10.055149 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" event={"ID":"8011261b-573f-4e09-894b-0643fba90f8d","Type":"ContainerStarted","Data":"31f721b6c22d36843ef7ccc7b76a7b4fbc84d92a708bb3f3d2880f977ba62caf"} Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.055432 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-lfdfp" podUID="dcb66d58-3d7a-47db-b3ff-2ede326cbe34" containerName="console" containerID="cri-o://5aac6a5f0a0ffc2a5d93dc1482f58c6abf35598426d69bd6ee761160b0afb22c" gracePeriod=15 Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.067549 4742 generic.go:334] "Generic (PLEG): container finished" podID="8011261b-573f-4e09-894b-0643fba90f8d" containerID="bb23b038052c66f0062827d35e28cae0dbee2e56797bccf8c633c51aacd5bda1" exitCode=0 Mar 17 11:27:11 crc 
kubenswrapper[4742]: I0317 11:27:11.067633 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" event={"ID":"8011261b-573f-4e09-894b-0643fba90f8d","Type":"ContainerDied","Data":"bb23b038052c66f0062827d35e28cae0dbee2e56797bccf8c633c51aacd5bda1"}
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.072145 4742 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.402491 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lfdfp_dcb66d58-3d7a-47db-b3ff-2ede326cbe34/console/0.log"
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.402851 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lfdfp"
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.467965 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-oauth-config\") pod \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") "
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.468123 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-config\") pod \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") "
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.468248 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-trusted-ca-bundle\") pod \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") "
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.468303 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-service-ca\") pod \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") "
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.468343 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz56r\" (UniqueName: \"kubernetes.io/projected/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-kube-api-access-nz56r\") pod \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") "
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.468388 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-serving-cert\") pod \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") "
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.468446 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-oauth-serving-cert\") pod \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\" (UID: \"dcb66d58-3d7a-47db-b3ff-2ede326cbe34\") "
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.469011 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-service-ca" (OuterVolumeSpecName: "service-ca") pod "dcb66d58-3d7a-47db-b3ff-2ede326cbe34" (UID: "dcb66d58-3d7a-47db-b3ff-2ede326cbe34"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.469181 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-config" (OuterVolumeSpecName: "console-config") pod "dcb66d58-3d7a-47db-b3ff-2ede326cbe34" (UID: "dcb66d58-3d7a-47db-b3ff-2ede326cbe34"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.469258 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dcb66d58-3d7a-47db-b3ff-2ede326cbe34" (UID: "dcb66d58-3d7a-47db-b3ff-2ede326cbe34"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.469275 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dcb66d58-3d7a-47db-b3ff-2ede326cbe34" (UID: "dcb66d58-3d7a-47db-b3ff-2ede326cbe34"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.475845 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-kube-api-access-nz56r" (OuterVolumeSpecName: "kube-api-access-nz56r") pod "dcb66d58-3d7a-47db-b3ff-2ede326cbe34" (UID: "dcb66d58-3d7a-47db-b3ff-2ede326cbe34"). InnerVolumeSpecName "kube-api-access-nz56r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.475868 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dcb66d58-3d7a-47db-b3ff-2ede326cbe34" (UID: "dcb66d58-3d7a-47db-b3ff-2ede326cbe34"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.476473 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dcb66d58-3d7a-47db-b3ff-2ede326cbe34" (UID: "dcb66d58-3d7a-47db-b3ff-2ede326cbe34"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.570216 4742 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.570274 4742 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-config\") on node \"crc\" DevicePath \"\""
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.570299 4742 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.570319 4742 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-service-ca\") on node \"crc\" DevicePath \"\""
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.570338 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz56r\" (UniqueName: \"kubernetes.io/projected/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-kube-api-access-nz56r\") on node \"crc\" DevicePath \"\""
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.570358 4742 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 17 11:27:11 crc kubenswrapper[4742]: I0317 11:27:11.570375 4742 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dcb66d58-3d7a-47db-b3ff-2ede326cbe34-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 17 11:27:12 crc kubenswrapper[4742]: I0317 11:27:12.080648 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-lfdfp_dcb66d58-3d7a-47db-b3ff-2ede326cbe34/console/0.log"
Mar 17 11:27:12 crc kubenswrapper[4742]: I0317 11:27:12.080741 4742 generic.go:334] "Generic (PLEG): container finished" podID="dcb66d58-3d7a-47db-b3ff-2ede326cbe34" containerID="5aac6a5f0a0ffc2a5d93dc1482f58c6abf35598426d69bd6ee761160b0afb22c" exitCode=2
Mar 17 11:27:12 crc kubenswrapper[4742]: I0317 11:27:12.080817 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lfdfp" event={"ID":"dcb66d58-3d7a-47db-b3ff-2ede326cbe34","Type":"ContainerDied","Data":"5aac6a5f0a0ffc2a5d93dc1482f58c6abf35598426d69bd6ee761160b0afb22c"}
Mar 17 11:27:12 crc kubenswrapper[4742]: I0317 11:27:12.080867 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-lfdfp" event={"ID":"dcb66d58-3d7a-47db-b3ff-2ede326cbe34","Type":"ContainerDied","Data":"e9d3aae0219dafebe6c09c6791b8113653e912cf485f713fee521bccffa165fc"}
Mar 17 11:27:12 crc kubenswrapper[4742]: I0317 11:27:12.080875 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-lfdfp"
Need to start a new one" pod="openshift-console/console-f9d7485db-lfdfp" Mar 17 11:27:12 crc kubenswrapper[4742]: I0317 11:27:12.080895 4742 scope.go:117] "RemoveContainer" containerID="5aac6a5f0a0ffc2a5d93dc1482f58c6abf35598426d69bd6ee761160b0afb22c" Mar 17 11:27:12 crc kubenswrapper[4742]: I0317 11:27:12.113003 4742 scope.go:117] "RemoveContainer" containerID="5aac6a5f0a0ffc2a5d93dc1482f58c6abf35598426d69bd6ee761160b0afb22c" Mar 17 11:27:12 crc kubenswrapper[4742]: E0317 11:27:12.113699 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aac6a5f0a0ffc2a5d93dc1482f58c6abf35598426d69bd6ee761160b0afb22c\": container with ID starting with 5aac6a5f0a0ffc2a5d93dc1482f58c6abf35598426d69bd6ee761160b0afb22c not found: ID does not exist" containerID="5aac6a5f0a0ffc2a5d93dc1482f58c6abf35598426d69bd6ee761160b0afb22c" Mar 17 11:27:12 crc kubenswrapper[4742]: I0317 11:27:12.113753 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aac6a5f0a0ffc2a5d93dc1482f58c6abf35598426d69bd6ee761160b0afb22c"} err="failed to get container status \"5aac6a5f0a0ffc2a5d93dc1482f58c6abf35598426d69bd6ee761160b0afb22c\": rpc error: code = NotFound desc = could not find container \"5aac6a5f0a0ffc2a5d93dc1482f58c6abf35598426d69bd6ee761160b0afb22c\": container with ID starting with 5aac6a5f0a0ffc2a5d93dc1482f58c6abf35598426d69bd6ee761160b0afb22c not found: ID does not exist" Mar 17 11:27:12 crc kubenswrapper[4742]: I0317 11:27:12.141448 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-lfdfp"] Mar 17 11:27:12 crc kubenswrapper[4742]: I0317 11:27:12.151721 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-lfdfp"] Mar 17 11:27:12 crc kubenswrapper[4742]: I0317 11:27:12.679952 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcb66d58-3d7a-47db-b3ff-2ede326cbe34" path="/var/lib/kubelet/pods/dcb66d58-3d7a-47db-b3ff-2ede326cbe34/volumes" Mar 17 11:27:13 crc kubenswrapper[4742]: I0317 11:27:13.088024 4742 generic.go:334] "Generic (PLEG): container finished" podID="8011261b-573f-4e09-894b-0643fba90f8d" containerID="3038f81689208d8b531ae42d21a0d52981a956858b169b4c773484b1d83ca56c" exitCode=0 Mar 17 11:27:13 crc kubenswrapper[4742]: I0317 11:27:13.088070 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" event={"ID":"8011261b-573f-4e09-894b-0643fba90f8d","Type":"ContainerDied","Data":"3038f81689208d8b531ae42d21a0d52981a956858b169b4c773484b1d83ca56c"} Mar 17 11:27:14 crc kubenswrapper[4742]: I0317 11:27:14.099572 4742 generic.go:334] "Generic (PLEG): container finished" podID="8011261b-573f-4e09-894b-0643fba90f8d" containerID="e087c9963842d5a56f206d73acdf09a2081c19ac2a9415ebdcb6bbf2b67b84ea" exitCode=0 Mar 17 11:27:14 crc kubenswrapper[4742]: I0317 11:27:14.099645 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" event={"ID":"8011261b-573f-4e09-894b-0643fba90f8d","Type":"ContainerDied","Data":"e087c9963842d5a56f206d73acdf09a2081c19ac2a9415ebdcb6bbf2b67b84ea"} Mar 17 11:27:15 crc kubenswrapper[4742]: I0317 11:27:15.464785 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" Mar 17 11:27:15 crc kubenswrapper[4742]: I0317 11:27:15.526614 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8011261b-573f-4e09-894b-0643fba90f8d-util\") pod \"8011261b-573f-4e09-894b-0643fba90f8d\" (UID: \"8011261b-573f-4e09-894b-0643fba90f8d\") " Mar 17 11:27:15 crc kubenswrapper[4742]: I0317 11:27:15.526742 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8011261b-573f-4e09-894b-0643fba90f8d-bundle\") pod \"8011261b-573f-4e09-894b-0643fba90f8d\" (UID: \"8011261b-573f-4e09-894b-0643fba90f8d\") " Mar 17 11:27:15 crc kubenswrapper[4742]: I0317 11:27:15.526801 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np9pk\" (UniqueName: \"kubernetes.io/projected/8011261b-573f-4e09-894b-0643fba90f8d-kube-api-access-np9pk\") pod \"8011261b-573f-4e09-894b-0643fba90f8d\" (UID: \"8011261b-573f-4e09-894b-0643fba90f8d\") " Mar 17 11:27:15 crc kubenswrapper[4742]: I0317 11:27:15.528079 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8011261b-573f-4e09-894b-0643fba90f8d-bundle" (OuterVolumeSpecName: "bundle") pod "8011261b-573f-4e09-894b-0643fba90f8d" (UID: "8011261b-573f-4e09-894b-0643fba90f8d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:27:15 crc kubenswrapper[4742]: I0317 11:27:15.544166 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8011261b-573f-4e09-894b-0643fba90f8d-kube-api-access-np9pk" (OuterVolumeSpecName: "kube-api-access-np9pk") pod "8011261b-573f-4e09-894b-0643fba90f8d" (UID: "8011261b-573f-4e09-894b-0643fba90f8d"). InnerVolumeSpecName "kube-api-access-np9pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:27:15 crc kubenswrapper[4742]: I0317 11:27:15.562076 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8011261b-573f-4e09-894b-0643fba90f8d-util" (OuterVolumeSpecName: "util") pod "8011261b-573f-4e09-894b-0643fba90f8d" (UID: "8011261b-573f-4e09-894b-0643fba90f8d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:27:15 crc kubenswrapper[4742]: I0317 11:27:15.628041 4742 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8011261b-573f-4e09-894b-0643fba90f8d-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:27:15 crc kubenswrapper[4742]: I0317 11:27:15.628089 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np9pk\" (UniqueName: \"kubernetes.io/projected/8011261b-573f-4e09-894b-0643fba90f8d-kube-api-access-np9pk\") on node \"crc\" DevicePath \"\"" Mar 17 11:27:15 crc kubenswrapper[4742]: I0317 11:27:15.628110 4742 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8011261b-573f-4e09-894b-0643fba90f8d-util\") on node \"crc\" DevicePath \"\"" Mar 17 11:27:16 crc kubenswrapper[4742]: I0317 11:27:16.119386 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" event={"ID":"8011261b-573f-4e09-894b-0643fba90f8d","Type":"ContainerDied","Data":"31f721b6c22d36843ef7ccc7b76a7b4fbc84d92a708bb3f3d2880f977ba62caf"} Mar 17 11:27:16 crc kubenswrapper[4742]: I0317 11:27:16.119446 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31f721b6c22d36843ef7ccc7b76a7b4fbc84d92a708bb3f3d2880f977ba62caf" Mar 17 11:27:16 crc kubenswrapper[4742]: I0317 11:27:16.119480 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp" Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.378571 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf"] Mar 17 11:27:24 crc kubenswrapper[4742]: E0317 11:27:24.379489 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8011261b-573f-4e09-894b-0643fba90f8d" containerName="util" Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.379504 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="8011261b-573f-4e09-894b-0643fba90f8d" containerName="util" Mar 17 11:27:24 crc kubenswrapper[4742]: E0317 11:27:24.379526 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8011261b-573f-4e09-894b-0643fba90f8d" containerName="extract" Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.379532 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="8011261b-573f-4e09-894b-0643fba90f8d" containerName="extract" Mar 17 11:27:24 crc kubenswrapper[4742]: E0317 11:27:24.379539 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb66d58-3d7a-47db-b3ff-2ede326cbe34" containerName="console" Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.379546 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb66d58-3d7a-47db-b3ff-2ede326cbe34" containerName="console" Mar 17 11:27:24 crc kubenswrapper[4742]: E0317 11:27:24.379555 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8011261b-573f-4e09-894b-0643fba90f8d" containerName="pull" Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.379560 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="8011261b-573f-4e09-894b-0643fba90f8d" containerName="pull" Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.379645 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcb66d58-3d7a-47db-b3ff-2ede326cbe34" containerName="console" Mar 
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.380226 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.382409 4742 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.383014 4742 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.383228 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.383314 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.383361 4742 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-5mfgs"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.407987 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf"]
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.446982 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h98bz\" (UniqueName: \"kubernetes.io/projected/f21bd592-6b38-41b3-a6a1-9b782891a659-kube-api-access-h98bz\") pod \"metallb-operator-controller-manager-6cbc4688f7-5wdxf\" (UID: \"f21bd592-6b38-41b3-a6a1-9b782891a659\") " pod="metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.447053 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f21bd592-6b38-41b3-a6a1-9b782891a659-apiservice-cert\") pod \"metallb-operator-controller-manager-6cbc4688f7-5wdxf\" (UID: \"f21bd592-6b38-41b3-a6a1-9b782891a659\") " pod="metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.447092 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f21bd592-6b38-41b3-a6a1-9b782891a659-webhook-cert\") pod \"metallb-operator-controller-manager-6cbc4688f7-5wdxf\" (UID: \"f21bd592-6b38-41b3-a6a1-9b782891a659\") " pod="metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.548387 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h98bz\" (UniqueName: \"kubernetes.io/projected/f21bd592-6b38-41b3-a6a1-9b782891a659-kube-api-access-h98bz\") pod \"metallb-operator-controller-manager-6cbc4688f7-5wdxf\" (UID: \"f21bd592-6b38-41b3-a6a1-9b782891a659\") " pod="metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.548439 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f21bd592-6b38-41b3-a6a1-9b782891a659-apiservice-cert\") pod \"metallb-operator-controller-manager-6cbc4688f7-5wdxf\" (UID: \"f21bd592-6b38-41b3-a6a1-9b782891a659\") " pod="metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.548460 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f21bd592-6b38-41b3-a6a1-9b782891a659-webhook-cert\") pod \"metallb-operator-controller-manager-6cbc4688f7-5wdxf\" (UID: \"f21bd592-6b38-41b3-a6a1-9b782891a659\") " pod="metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.557562 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f21bd592-6b38-41b3-a6a1-9b782891a659-webhook-cert\") pod \"metallb-operator-controller-manager-6cbc4688f7-5wdxf\" (UID: \"f21bd592-6b38-41b3-a6a1-9b782891a659\") " pod="metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.560563 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f21bd592-6b38-41b3-a6a1-9b782891a659-apiservice-cert\") pod \"metallb-operator-controller-manager-6cbc4688f7-5wdxf\" (UID: \"f21bd592-6b38-41b3-a6a1-9b782891a659\") " pod="metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.569659 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h98bz\" (UniqueName: \"kubernetes.io/projected/f21bd592-6b38-41b3-a6a1-9b782891a659-kube-api-access-h98bz\") pod \"metallb-operator-controller-manager-6cbc4688f7-5wdxf\" (UID: \"f21bd592-6b38-41b3-a6a1-9b782891a659\") " pod="metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.629480 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7"]
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.630204 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.636140 4742 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.636188 4742 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.636452 4742 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-6c6jc"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.648892 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7"]
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.699779 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.750719 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3e260a39-fc3d-48d3-90f5-151700332db7-apiservice-cert\") pod \"metallb-operator-webhook-server-5df756f8d6-hq5d7\" (UID: \"3e260a39-fc3d-48d3-90f5-151700332db7\") " pod="metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.750790 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5hwd\" (UniqueName: \"kubernetes.io/projected/3e260a39-fc3d-48d3-90f5-151700332db7-kube-api-access-h5hwd\") pod \"metallb-operator-webhook-server-5df756f8d6-hq5d7\" (UID: \"3e260a39-fc3d-48d3-90f5-151700332db7\") " pod="metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.750867 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3e260a39-fc3d-48d3-90f5-151700332db7-webhook-cert\") pod \"metallb-operator-webhook-server-5df756f8d6-hq5d7\" (UID: \"3e260a39-fc3d-48d3-90f5-151700332db7\") " pod="metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.852502 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3e260a39-fc3d-48d3-90f5-151700332db7-webhook-cert\") pod \"metallb-operator-webhook-server-5df756f8d6-hq5d7\" (UID: \"3e260a39-fc3d-48d3-90f5-151700332db7\") " pod="metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.852817 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3e260a39-fc3d-48d3-90f5-151700332db7-apiservice-cert\") pod \"metallb-operator-webhook-server-5df756f8d6-hq5d7\" (UID: \"3e260a39-fc3d-48d3-90f5-151700332db7\") " pod="metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.852847 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5hwd\" (UniqueName: \"kubernetes.io/projected/3e260a39-fc3d-48d3-90f5-151700332db7-kube-api-access-h5hwd\") pod \"metallb-operator-webhook-server-5df756f8d6-hq5d7\" (UID: \"3e260a39-fc3d-48d3-90f5-151700332db7\") " pod="metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.857191 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3e260a39-fc3d-48d3-90f5-151700332db7-apiservice-cert\") pod \"metallb-operator-webhook-server-5df756f8d6-hq5d7\" (UID: \"3e260a39-fc3d-48d3-90f5-151700332db7\") " pod="metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7"
Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.857215 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3e260a39-fc3d-48d3-90f5-151700332db7-webhook-cert\") pod \"metallb-operator-webhook-server-5df756f8d6-hq5d7\" (UID: \"3e260a39-fc3d-48d3-90f5-151700332db7\") " pod="metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7"
pod="metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7" Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.882276 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5hwd\" (UniqueName: \"kubernetes.io/projected/3e260a39-fc3d-48d3-90f5-151700332db7-kube-api-access-h5hwd\") pod \"metallb-operator-webhook-server-5df756f8d6-hq5d7\" (UID: \"3e260a39-fc3d-48d3-90f5-151700332db7\") " pod="metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7" Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.882556 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf"] Mar 17 11:27:24 crc kubenswrapper[4742]: W0317 11:27:24.891291 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf21bd592_6b38_41b3_a6a1_9b782891a659.slice/crio-48e5bfff878638de21043e829fdf55cb1168325ca2a3dbbc6820c4ef350cff37 WatchSource:0}: Error finding container 48e5bfff878638de21043e829fdf55cb1168325ca2a3dbbc6820c4ef350cff37: Status 404 returned error can't find the container with id 48e5bfff878638de21043e829fdf55cb1168325ca2a3dbbc6820c4ef350cff37 Mar 17 11:27:24 crc kubenswrapper[4742]: I0317 11:27:24.945767 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7" Mar 17 11:27:25 crc kubenswrapper[4742]: I0317 11:27:25.176356 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf" event={"ID":"f21bd592-6b38-41b3-a6a1-9b782891a659","Type":"ContainerStarted","Data":"48e5bfff878638de21043e829fdf55cb1168325ca2a3dbbc6820c4ef350cff37"} Mar 17 11:27:25 crc kubenswrapper[4742]: I0317 11:27:25.202070 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7"] Mar 17 11:27:25 crc kubenswrapper[4742]: W0317 11:27:25.209195 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e260a39_fc3d_48d3_90f5_151700332db7.slice/crio-32fe9df2fd690c6591978f570ff16daa841afbdfa686ed28a76d3ced2ad336b9 WatchSource:0}: Error finding container 32fe9df2fd690c6591978f570ff16daa841afbdfa686ed28a76d3ced2ad336b9: Status 404 returned error can't find the container with id 32fe9df2fd690c6591978f570ff16daa841afbdfa686ed28a76d3ced2ad336b9 Mar 17 11:27:26 crc kubenswrapper[4742]: I0317 11:27:26.184777 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7" event={"ID":"3e260a39-fc3d-48d3-90f5-151700332db7","Type":"ContainerStarted","Data":"32fe9df2fd690c6591978f570ff16daa841afbdfa686ed28a76d3ced2ad336b9"} Mar 17 11:27:29 crc kubenswrapper[4742]: I0317 11:27:29.209933 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf" event={"ID":"f21bd592-6b38-41b3-a6a1-9b782891a659","Type":"ContainerStarted","Data":"59fb303941e94c66b8d68307f19beca235fe8d93c5cb93aaf8a16009cdac8cfb"} Mar 17 11:27:29 crc kubenswrapper[4742]: I0317 11:27:29.210499 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf" Mar 17 11:27:29 crc kubenswrapper[4742]: I0317 11:27:29.243163 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf" podStartSLOduration=1.5864319980000001 podStartE2EDuration="5.243145338s" podCreationTimestamp="2026-03-17 11:27:24 +0000 UTC" firstStartedPulling="2026-03-17 11:27:24.899824205 +0000 UTC m=+948.025951963" lastFinishedPulling="2026-03-17 11:27:28.556537545 +0000 UTC m=+951.682665303" observedRunningTime="2026-03-17 11:27:29.24065713 +0000 UTC m=+952.366784888" watchObservedRunningTime="2026-03-17 11:27:29.243145338 +0000 UTC m=+952.369273096" Mar 17 11:27:31 crc kubenswrapper[4742]: I0317 11:27:31.225424 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7" event={"ID":"3e260a39-fc3d-48d3-90f5-151700332db7","Type":"ContainerStarted","Data":"092a5254ac8ece18d634d4f930d84d43c6aa436f938019e6dfc01f6bf536a8af"} Mar 17 11:27:31 crc kubenswrapper[4742]: I0317 11:27:31.226436 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7" Mar 17 11:27:31 crc kubenswrapper[4742]: I0317 11:27:31.245867 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7" podStartSLOduration=2.104263326 podStartE2EDuration="7.245841234s" podCreationTimestamp="2026-03-17 11:27:24 +0000 UTC" firstStartedPulling="2026-03-17 11:27:25.212350915 +0000 UTC m=+948.338478673" lastFinishedPulling="2026-03-17 11:27:30.353928803 +0000 UTC m=+953.480056581" observedRunningTime="2026-03-17 11:27:31.243398668 +0000 UTC m=+954.369526436" watchObservedRunningTime="2026-03-17 11:27:31.245841234 +0000 UTC m=+954.371969032" Mar 17 11:27:44 crc kubenswrapper[4742]: I0317 11:27:44.953886 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5df756f8d6-hq5d7" Mar 17 11:27:49 crc kubenswrapper[4742]: I0317 11:27:49.116009 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m4mb4"] Mar 17 11:27:49 crc kubenswrapper[4742]: I0317 11:27:49.118108 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m4mb4" Mar 17 11:27:49 crc kubenswrapper[4742]: I0317 11:27:49.126436 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m4mb4"] Mar 17 11:27:49 crc kubenswrapper[4742]: I0317 11:27:49.177788 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-catalog-content\") pod \"certified-operators-m4mb4\" (UID: \"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b\") " pod="openshift-marketplace/certified-operators-m4mb4" Mar 17 11:27:49 crc kubenswrapper[4742]: I0317 11:27:49.177849 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-utilities\") pod \"certified-operators-m4mb4\" (UID: \"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b\") " pod="openshift-marketplace/certified-operators-m4mb4" Mar 17 11:27:49 crc kubenswrapper[4742]: I0317 11:27:49.177946 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82c2v\" (UniqueName: \"kubernetes.io/projected/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-kube-api-access-82c2v\") pod \"certified-operators-m4mb4\" (UID: \"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b\") " pod="openshift-marketplace/certified-operators-m4mb4" Mar 17 11:27:49 crc kubenswrapper[4742]: I0317 11:27:49.279351 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82c2v\" (UniqueName: \"kubernetes.io/projected/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-kube-api-access-82c2v\") pod \"certified-operators-m4mb4\" (UID: \"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b\") " pod="openshift-marketplace/certified-operators-m4mb4" Mar 17 11:27:49 crc kubenswrapper[4742]: I0317 11:27:49.279417 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-catalog-content\") pod \"certified-operators-m4mb4\" (UID: \"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b\") " pod="openshift-marketplace/certified-operators-m4mb4" Mar 17 11:27:49 crc kubenswrapper[4742]: I0317 11:27:49.279451 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-utilities\") pod \"certified-operators-m4mb4\" (UID: \"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b\") " pod="openshift-marketplace/certified-operators-m4mb4" Mar 17 11:27:49 crc kubenswrapper[4742]: I0317 11:27:49.279860 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-utilities\") pod \"certified-operators-m4mb4\" (UID: \"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b\") " pod="openshift-marketplace/certified-operators-m4mb4" Mar 17 11:27:49 crc kubenswrapper[4742]: I0317 11:27:49.280040 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-catalog-content\") pod \"certified-operators-m4mb4\" (UID: \"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b\") " pod="openshift-marketplace/certified-operators-m4mb4" Mar 17 11:27:49 crc kubenswrapper[4742]: I0317 11:27:49.301797 4742 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-82c2v\" (UniqueName: \"kubernetes.io/projected/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-kube-api-access-82c2v\") pod \"certified-operators-m4mb4\" (UID: \"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b\") " pod="openshift-marketplace/certified-operators-m4mb4" Mar 17 11:27:49 crc kubenswrapper[4742]: I0317 11:27:49.447074 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m4mb4" Mar 17 11:27:49 crc kubenswrapper[4742]: I0317 11:27:49.786991 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m4mb4"] Mar 17 11:27:50 crc kubenswrapper[4742]: I0317 11:27:50.342966 4742 generic.go:334] "Generic (PLEG): container finished" podID="0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b" containerID="d7c63a86236ff34e5c8a2a05409ce9342d9a47af304eea0097feb51f2e675ccb" exitCode=0 Mar 17 11:27:50 crc kubenswrapper[4742]: I0317 11:27:50.343075 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4mb4" event={"ID":"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b","Type":"ContainerDied","Data":"d7c63a86236ff34e5c8a2a05409ce9342d9a47af304eea0097feb51f2e675ccb"} Mar 17 11:27:50 crc kubenswrapper[4742]: I0317 11:27:50.344218 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4mb4" event={"ID":"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b","Type":"ContainerStarted","Data":"783346426cf83b1e57ead47bfb73af803caf2cad89716be3b9af27772ce30d0b"} Mar 17 11:27:51 crc kubenswrapper[4742]: I0317 11:27:51.355475 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4mb4" event={"ID":"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b","Type":"ContainerStarted","Data":"59eb85f769f2b822c0b31ca781080aeb80e224b7ff496cc5dc04f2f9aa0b6da1"} Mar 17 11:27:52 crc kubenswrapper[4742]: I0317 11:27:52.363526 4742 generic.go:334] "Generic (PLEG): container finished" podID="0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b" containerID="59eb85f769f2b822c0b31ca781080aeb80e224b7ff496cc5dc04f2f9aa0b6da1" exitCode=0 Mar 17 11:27:52 crc kubenswrapper[4742]: I0317 11:27:52.363643 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4mb4" event={"ID":"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b","Type":"ContainerDied","Data":"59eb85f769f2b822c0b31ca781080aeb80e224b7ff496cc5dc04f2f9aa0b6da1"} Mar 17 11:27:53 crc kubenswrapper[4742]: I0317 11:27:53.384739 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4mb4" event={"ID":"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b","Type":"ContainerStarted","Data":"09316dcc442844265250a4fd7931f22f6c7909293c6c97e8744d11a51441e426"} Mar 17 11:27:53 crc kubenswrapper[4742]: I0317 11:27:53.412096 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m4mb4" podStartSLOduration=1.952192325 podStartE2EDuration="4.412071695s" podCreationTimestamp="2026-03-17 11:27:49 +0000 UTC" firstStartedPulling="2026-03-17 11:27:50.344453543 +0000 UTC m=+973.470581301" lastFinishedPulling="2026-03-17 11:27:52.804332893 +0000 UTC m=+975.930460671" observedRunningTime="2026-03-17 11:27:53.409558205 +0000 UTC m=+976.535686003" watchObservedRunningTime="2026-03-17 11:27:53.412071695 +0000 UTC m=+976.538199493" Mar 17 11:27:59 crc kubenswrapper[4742]: I0317 11:27:59.447348 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-m4mb4" Mar 17 11:27:59 crc kubenswrapper[4742]: I0317 11:27:59.447764 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m4mb4" Mar 17 11:27:59 crc kubenswrapper[4742]: I0317 11:27:59.504438 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m4mb4" Mar 17 11:28:00 crc kubenswrapper[4742]: I0317 11:28:00.146001 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562448-4pbmf"] Mar 17 11:28:00 crc kubenswrapper[4742]: I0317 11:28:00.146771 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562448-4pbmf" Mar 17 11:28:00 crc kubenswrapper[4742]: I0317 11:28:00.150243 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:28:00 crc kubenswrapper[4742]: I0317 11:28:00.150430 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:28:00 crc kubenswrapper[4742]: I0317 11:28:00.157568 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:28:00 crc kubenswrapper[4742]: I0317 11:28:00.158442 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swn9f\" (UniqueName: \"kubernetes.io/projected/6a9d6abc-5eec-4122-9f35-665d934ff0ff-kube-api-access-swn9f\") pod \"auto-csr-approver-29562448-4pbmf\" (UID: \"6a9d6abc-5eec-4122-9f35-665d934ff0ff\") " pod="openshift-infra/auto-csr-approver-29562448-4pbmf" Mar 17 11:28:00 crc kubenswrapper[4742]: I0317 11:28:00.163502 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562448-4pbmf"] Mar 17 11:28:00 crc kubenswrapper[4742]: I0317 11:28:00.259233 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swn9f\" (UniqueName: \"kubernetes.io/projected/6a9d6abc-5eec-4122-9f35-665d934ff0ff-kube-api-access-swn9f\") pod \"auto-csr-approver-29562448-4pbmf\" (UID: \"6a9d6abc-5eec-4122-9f35-665d934ff0ff\") " pod="openshift-infra/auto-csr-approver-29562448-4pbmf" Mar 17 11:28:00 crc kubenswrapper[4742]: I0317 11:28:00.289039 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swn9f\" (UniqueName: \"kubernetes.io/projected/6a9d6abc-5eec-4122-9f35-665d934ff0ff-kube-api-access-swn9f\") pod \"auto-csr-approver-29562448-4pbmf\" (UID: \"6a9d6abc-5eec-4122-9f35-665d934ff0ff\") " pod="openshift-infra/auto-csr-approver-29562448-4pbmf" Mar 17 11:28:00 crc kubenswrapper[4742]: I0317 11:28:00.473607 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562448-4pbmf" Mar 17 11:28:00 crc kubenswrapper[4742]: I0317 11:28:00.505024 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m4mb4" Mar 17 11:28:00 crc kubenswrapper[4742]: I0317 11:28:00.573280 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m4mb4"] Mar 17 11:28:00 crc kubenswrapper[4742]: I0317 11:28:00.927842 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562448-4pbmf"] Mar 17 11:28:01 crc kubenswrapper[4742]: I0317 11:28:01.441827 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562448-4pbmf" event={"ID":"6a9d6abc-5eec-4122-9f35-665d934ff0ff","Type":"ContainerStarted","Data":"46a68ad99a39967d9f69dfc7be0553a0a0ffb4529a95f112a8523c402edcbf37"} Mar 17 11:28:02 crc kubenswrapper[4742]: I0317 11:28:02.454079 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562448-4pbmf" event={"ID":"6a9d6abc-5eec-4122-9f35-665d934ff0ff","Type":"ContainerStarted","Data":"aba1e2013cc35d6dd6c50570b0324114bfdcc8cd54dc6f804c76a5cc8e0c5862"} Mar 17 11:28:02 crc kubenswrapper[4742]: I0317 11:28:02.454342 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m4mb4" podUID="0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b" containerName="registry-server" containerID="cri-o://09316dcc442844265250a4fd7931f22f6c7909293c6c97e8744d11a51441e426" gracePeriod=2 Mar 17 11:28:02 crc kubenswrapper[4742]: I0317 11:28:02.833448 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m4mb4" Mar 17 11:28:02 crc kubenswrapper[4742]: I0317 11:28:02.896680 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-catalog-content\") pod \"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b\" (UID: \"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b\") " Mar 17 11:28:02 crc kubenswrapper[4742]: I0317 11:28:02.896742 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-utilities\") pod \"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b\" (UID: \"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b\") " Mar 17 11:28:02 crc kubenswrapper[4742]: I0317 11:28:02.896796 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82c2v\" (UniqueName: \"kubernetes.io/projected/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-kube-api-access-82c2v\") pod \"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b\" (UID: \"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b\") " Mar 17 11:28:02 crc kubenswrapper[4742]: I0317 11:28:02.898467 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-utilities" (OuterVolumeSpecName: "utilities") pod "0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b" (UID: "0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:28:02 crc kubenswrapper[4742]: I0317 11:28:02.898817 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:28:02 crc kubenswrapper[4742]: I0317 11:28:02.903064 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-kube-api-access-82c2v" (OuterVolumeSpecName: "kube-api-access-82c2v") pod "0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b" (UID: "0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b"). InnerVolumeSpecName "kube-api-access-82c2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:28:02 crc kubenswrapper[4742]: I0317 11:28:02.961852 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b" (UID: "0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:28:02 crc kubenswrapper[4742]: I0317 11:28:02.999661 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82c2v\" (UniqueName: \"kubernetes.io/projected/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-kube-api-access-82c2v\") on node \"crc\" DevicePath \"\"" Mar 17 11:28:02 crc kubenswrapper[4742]: I0317 11:28:02.999697 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.466104 4742 generic.go:334] "Generic (PLEG): container finished" podID="6a9d6abc-5eec-4122-9f35-665d934ff0ff" containerID="aba1e2013cc35d6dd6c50570b0324114bfdcc8cd54dc6f804c76a5cc8e0c5862" exitCode=0 Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.466213 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562448-4pbmf" event={"ID":"6a9d6abc-5eec-4122-9f35-665d934ff0ff","Type":"ContainerDied","Data":"aba1e2013cc35d6dd6c50570b0324114bfdcc8cd54dc6f804c76a5cc8e0c5862"} Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.471659 4742 generic.go:334] "Generic (PLEG): container finished" podID="0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b" containerID="09316dcc442844265250a4fd7931f22f6c7909293c6c97e8744d11a51441e426" exitCode=0 Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.471723 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4mb4" event={"ID":"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b","Type":"ContainerDied","Data":"09316dcc442844265250a4fd7931f22f6c7909293c6c97e8744d11a51441e426"} Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.471799 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m4mb4" event={"ID":"0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b","Type":"ContainerDied","Data":"783346426cf83b1e57ead47bfb73af803caf2cad89716be3b9af27772ce30d0b"} Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.471800 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m4mb4" Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.471833 4742 scope.go:117] "RemoveContainer" containerID="09316dcc442844265250a4fd7931f22f6c7909293c6c97e8744d11a51441e426" Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.503479 4742 scope.go:117] "RemoveContainer" containerID="59eb85f769f2b822c0b31ca781080aeb80e224b7ff496cc5dc04f2f9aa0b6da1" Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.525368 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m4mb4"] Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.531966 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m4mb4"] Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.549992 4742 scope.go:117] "RemoveContainer" containerID="d7c63a86236ff34e5c8a2a05409ce9342d9a47af304eea0097feb51f2e675ccb" Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.570059 4742 scope.go:117] "RemoveContainer" containerID="09316dcc442844265250a4fd7931f22f6c7909293c6c97e8744d11a51441e426" Mar 17 11:28:03 crc kubenswrapper[4742]: E0317 11:28:03.571843 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09316dcc442844265250a4fd7931f22f6c7909293c6c97e8744d11a51441e426\": container with ID starting with 09316dcc442844265250a4fd7931f22f6c7909293c6c97e8744d11a51441e426 not found: ID does not exist" containerID="09316dcc442844265250a4fd7931f22f6c7909293c6c97e8744d11a51441e426" Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.571945 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09316dcc442844265250a4fd7931f22f6c7909293c6c97e8744d11a51441e426"} err="failed to get container status \"09316dcc442844265250a4fd7931f22f6c7909293c6c97e8744d11a51441e426\": rpc error: code = NotFound desc = could not find container \"09316dcc442844265250a4fd7931f22f6c7909293c6c97e8744d11a51441e426\": container with ID starting with 09316dcc442844265250a4fd7931f22f6c7909293c6c97e8744d11a51441e426 not found: ID does not exist" Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.571997 4742 scope.go:117] "RemoveContainer" containerID="59eb85f769f2b822c0b31ca781080aeb80e224b7ff496cc5dc04f2f9aa0b6da1" Mar 17 11:28:03 crc kubenswrapper[4742]: E0317 11:28:03.572529 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59eb85f769f2b822c0b31ca781080aeb80e224b7ff496cc5dc04f2f9aa0b6da1\": container with ID starting with 59eb85f769f2b822c0b31ca781080aeb80e224b7ff496cc5dc04f2f9aa0b6da1 not found: ID does not exist" containerID="59eb85f769f2b822c0b31ca781080aeb80e224b7ff496cc5dc04f2f9aa0b6da1" Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.572587 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59eb85f769f2b822c0b31ca781080aeb80e224b7ff496cc5dc04f2f9aa0b6da1"} err="failed to get container status \"59eb85f769f2b822c0b31ca781080aeb80e224b7ff496cc5dc04f2f9aa0b6da1\": rpc error: code = NotFound desc = could not find container \"59eb85f769f2b822c0b31ca781080aeb80e224b7ff496cc5dc04f2f9aa0b6da1\": container with ID starting with 59eb85f769f2b822c0b31ca781080aeb80e224b7ff496cc5dc04f2f9aa0b6da1 not found: ID does not exist" Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.572621 4742 scope.go:117] "RemoveContainer" 
containerID="d7c63a86236ff34e5c8a2a05409ce9342d9a47af304eea0097feb51f2e675ccb" Mar 17 11:28:03 crc kubenswrapper[4742]: E0317 11:28:03.574233 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c63a86236ff34e5c8a2a05409ce9342d9a47af304eea0097feb51f2e675ccb\": container with ID starting with d7c63a86236ff34e5c8a2a05409ce9342d9a47af304eea0097feb51f2e675ccb not found: ID does not exist" containerID="d7c63a86236ff34e5c8a2a05409ce9342d9a47af304eea0097feb51f2e675ccb" Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.574298 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c63a86236ff34e5c8a2a05409ce9342d9a47af304eea0097feb51f2e675ccb"} err="failed to get container status \"d7c63a86236ff34e5c8a2a05409ce9342d9a47af304eea0097feb51f2e675ccb\": rpc error: code = NotFound desc = could not find container \"d7c63a86236ff34e5c8a2a05409ce9342d9a47af304eea0097feb51f2e675ccb\": container with ID starting with d7c63a86236ff34e5c8a2a05409ce9342d9a47af304eea0097feb51f2e675ccb not found: ID does not exist" Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.728560 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562448-4pbmf" Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.810129 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swn9f\" (UniqueName: \"kubernetes.io/projected/6a9d6abc-5eec-4122-9f35-665d934ff0ff-kube-api-access-swn9f\") pod \"6a9d6abc-5eec-4122-9f35-665d934ff0ff\" (UID: \"6a9d6abc-5eec-4122-9f35-665d934ff0ff\") " Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.812877 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a9d6abc-5eec-4122-9f35-665d934ff0ff-kube-api-access-swn9f" (OuterVolumeSpecName: "kube-api-access-swn9f") pod "6a9d6abc-5eec-4122-9f35-665d934ff0ff" (UID: "6a9d6abc-5eec-4122-9f35-665d934ff0ff"). InnerVolumeSpecName "kube-api-access-swn9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:28:03 crc kubenswrapper[4742]: I0317 11:28:03.911547 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swn9f\" (UniqueName: \"kubernetes.io/projected/6a9d6abc-5eec-4122-9f35-665d934ff0ff-kube-api-access-swn9f\") on node \"crc\" DevicePath \"\"" Mar 17 11:28:04 crc kubenswrapper[4742]: I0317 11:28:04.482712 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562448-4pbmf" event={"ID":"6a9d6abc-5eec-4122-9f35-665d934ff0ff","Type":"ContainerDied","Data":"46a68ad99a39967d9f69dfc7be0553a0a0ffb4529a95f112a8523c402edcbf37"} Mar 17 11:28:04 crc kubenswrapper[4742]: I0317 11:28:04.483320 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46a68ad99a39967d9f69dfc7be0553a0a0ffb4529a95f112a8523c402edcbf37" Mar 17 11:28:04 crc kubenswrapper[4742]: I0317 11:28:04.482791 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562448-4pbmf" Mar 17 11:28:04 crc kubenswrapper[4742]: I0317 11:28:04.669107 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b" path="/var/lib/kubelet/pods/0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b/volumes" Mar 17 11:28:04 crc kubenswrapper[4742]: I0317 11:28:04.702373 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6cbc4688f7-5wdxf" Mar 17 11:28:04 crc kubenswrapper[4742]: I0317 11:28:04.781464 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562442-cnzcc"] Mar 17 11:28:04 crc kubenswrapper[4742]: I0317 11:28:04.785962 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562442-cnzcc"] Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.542082 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-8rkdr"] Mar 17 11:28:05 crc kubenswrapper[4742]: E0317 11:28:05.542365 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b" containerName="extract-content" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.542382 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b" containerName="extract-content" Mar 17 11:28:05 crc kubenswrapper[4742]: E0317 11:28:05.542394 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b" containerName="extract-utilities" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.542402 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b" containerName="extract-utilities" Mar 17 11:28:05 crc kubenswrapper[4742]: E0317 11:28:05.542413 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9d6abc-5eec-4122-9f35-665d934ff0ff" containerName="oc" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.542421 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9d6abc-5eec-4122-9f35-665d934ff0ff" containerName="oc" Mar 17 11:28:05 crc kubenswrapper[4742]: E0317 11:28:05.542434 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b" containerName="registry-server" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.542441 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b" containerName="registry-server" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.542568 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a9d6abc-5eec-4122-9f35-665d934ff0ff" containerName="oc" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.542585 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ed63af0-ccc9-4da7-a0a9-623ae52bfe1b" containerName="registry-server" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.544754 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.547475 4742 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-snkc8" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.547954 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.548168 4742 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.565611 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-pfql6"] Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.566632 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pfql6" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.570162 4742 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.579455 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-pfql6"] Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.634028 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjwbn\" (UniqueName: \"kubernetes.io/projected/e890c085-704d-45c9-9166-3d27780a18f6-kube-api-access-vjwbn\") pod \"frr-k8s-webhook-server-bcc4b6f68-pfql6\" (UID: \"e890c085-704d-45c9-9166-3d27780a18f6\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pfql6" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.634108 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/11909043-e311-4bf8-9ecf-8b3d33d2584a-frr-sockets\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.634149 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb2hg\" (UniqueName: \"kubernetes.io/projected/11909043-e311-4bf8-9ecf-8b3d33d2584a-kube-api-access-vb2hg\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.634177 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/11909043-e311-4bf8-9ecf-8b3d33d2584a-reloader\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.634200 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/11909043-e311-4bf8-9ecf-8b3d33d2584a-metrics\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.634221 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e890c085-704d-45c9-9166-3d27780a18f6-cert\") pod 
\"frr-k8s-webhook-server-bcc4b6f68-pfql6\" (UID: \"e890c085-704d-45c9-9166-3d27780a18f6\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pfql6" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.634392 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/11909043-e311-4bf8-9ecf-8b3d33d2584a-frr-startup\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.634446 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/11909043-e311-4bf8-9ecf-8b3d33d2584a-frr-conf\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.634506 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11909043-e311-4bf8-9ecf-8b3d33d2584a-metrics-certs\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.649765 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-67kh2"] Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.651459 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-67kh2" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.655202 4742 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.655231 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.655319 4742 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-rghlr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.655388 4742 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.663298 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-497xk"] Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.664650 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-497xk" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.666954 4742 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.689214 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-497xk"] Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.735197 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80e4c493-69b8-4854-b25a-5126fd02720e-metrics-certs\") pod \"controller-7bb4cc7c98-497xk\" (UID: \"80e4c493-69b8-4854-b25a-5126fd02720e\") " pod="metallb-system/controller-7bb4cc7c98-497xk" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.735506 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/11909043-e311-4bf8-9ecf-8b3d33d2584a-frr-startup\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.735595 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f0349b48-f18d-415d-bb8c-2ee11d489f9e-metallb-excludel2\") pod \"speaker-67kh2\" (UID: \"f0349b48-f18d-415d-bb8c-2ee11d489f9e\") " pod="metallb-system/speaker-67kh2" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.735679 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/11909043-e311-4bf8-9ecf-8b3d33d2584a-frr-conf\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.735760 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11909043-e311-4bf8-9ecf-8b3d33d2584a-metrics-certs\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.735853 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0349b48-f18d-415d-bb8c-2ee11d489f9e-metrics-certs\") pod \"speaker-67kh2\" (UID: \"f0349b48-f18d-415d-bb8c-2ee11d489f9e\") " pod="metallb-system/speaker-67kh2" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.735967 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfmdz\" (UniqueName: \"kubernetes.io/projected/f0349b48-f18d-415d-bb8c-2ee11d489f9e-kube-api-access-vfmdz\") pod \"speaker-67kh2\" (UID: \"f0349b48-f18d-415d-bb8c-2ee11d489f9e\") " pod="metallb-system/speaker-67kh2" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.736057 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f0349b48-f18d-415d-bb8c-2ee11d489f9e-memberlist\") pod \"speaker-67kh2\" (UID: \"f0349b48-f18d-415d-bb8c-2ee11d489f9e\") " pod="metallb-system/speaker-67kh2" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.736141 4742 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6g55\" (UniqueName: \"kubernetes.io/projected/80e4c493-69b8-4854-b25a-5126fd02720e-kube-api-access-h6g55\") pod \"controller-7bb4cc7c98-497xk\" (UID: \"80e4c493-69b8-4854-b25a-5126fd02720e\") " pod="metallb-system/controller-7bb4cc7c98-497xk" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.736439 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjwbn\" (UniqueName: \"kubernetes.io/projected/e890c085-704d-45c9-9166-3d27780a18f6-kube-api-access-vjwbn\") pod \"frr-k8s-webhook-server-bcc4b6f68-pfql6\" (UID: \"e890c085-704d-45c9-9166-3d27780a18f6\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pfql6" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.736521 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/11909043-e311-4bf8-9ecf-8b3d33d2584a-frr-sockets\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.736613 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb2hg\" (UniqueName: \"kubernetes.io/projected/11909043-e311-4bf8-9ecf-8b3d33d2584a-kube-api-access-vb2hg\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.736688 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/11909043-e311-4bf8-9ecf-8b3d33d2584a-reloader\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.736762 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/11909043-e311-4bf8-9ecf-8b3d33d2584a-metrics\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.737057 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e890c085-704d-45c9-9166-3d27780a18f6-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-pfql6\" (UID: \"e890c085-704d-45c9-9166-3d27780a18f6\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pfql6" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.737171 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80e4c493-69b8-4854-b25a-5126fd02720e-cert\") pod \"controller-7bb4cc7c98-497xk\" (UID: \"80e4c493-69b8-4854-b25a-5126fd02720e\") " pod="metallb-system/controller-7bb4cc7c98-497xk" Mar 17 11:28:05 crc kubenswrapper[4742]: E0317 11:28:05.738610 4742 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 17 11:28:05 crc kubenswrapper[4742]: E0317 11:28:05.738680 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e890c085-704d-45c9-9166-3d27780a18f6-cert podName:e890c085-704d-45c9-9166-3d27780a18f6 nodeName:}" failed. No retries permitted until 2026-03-17 11:28:06.23866376 +0000 UTC m=+989.364791518 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e890c085-704d-45c9-9166-3d27780a18f6-cert") pod "frr-k8s-webhook-server-bcc4b6f68-pfql6" (UID: "e890c085-704d-45c9-9166-3d27780a18f6") : secret "frr-k8s-webhook-server-cert" not found Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.738747 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/11909043-e311-4bf8-9ecf-8b3d33d2584a-frr-conf\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.739044 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/11909043-e311-4bf8-9ecf-8b3d33d2584a-frr-startup\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.739118 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/11909043-e311-4bf8-9ecf-8b3d33d2584a-reloader\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.739268 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/11909043-e311-4bf8-9ecf-8b3d33d2584a-metrics\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.739824 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/11909043-e311-4bf8-9ecf-8b3d33d2584a-frr-sockets\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.744077 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11909043-e311-4bf8-9ecf-8b3d33d2584a-metrics-certs\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.755701 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb2hg\" (UniqueName: \"kubernetes.io/projected/11909043-e311-4bf8-9ecf-8b3d33d2584a-kube-api-access-vb2hg\") pod \"frr-k8s-8rkdr\" (UID: \"11909043-e311-4bf8-9ecf-8b3d33d2584a\") " pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.757546 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjwbn\" (UniqueName: \"kubernetes.io/projected/e890c085-704d-45c9-9166-3d27780a18f6-kube-api-access-vjwbn\") pod \"frr-k8s-webhook-server-bcc4b6f68-pfql6\" (UID: \"e890c085-704d-45c9-9166-3d27780a18f6\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pfql6" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.838372 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0349b48-f18d-415d-bb8c-2ee11d489f9e-metrics-certs\") pod \"speaker-67kh2\" (UID: \"f0349b48-f18d-415d-bb8c-2ee11d489f9e\") " pod="metallb-system/speaker-67kh2" Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 
Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.838459 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f0349b48-f18d-415d-bb8c-2ee11d489f9e-memberlist\") pod \"speaker-67kh2\" (UID: \"f0349b48-f18d-415d-bb8c-2ee11d489f9e\") " pod="metallb-system/speaker-67kh2"
Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.838477 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6g55\" (UniqueName: \"kubernetes.io/projected/80e4c493-69b8-4854-b25a-5126fd02720e-kube-api-access-h6g55\") pod \"controller-7bb4cc7c98-497xk\" (UID: \"80e4c493-69b8-4854-b25a-5126fd02720e\") " pod="metallb-system/controller-7bb4cc7c98-497xk"
Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.838525 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80e4c493-69b8-4854-b25a-5126fd02720e-cert\") pod \"controller-7bb4cc7c98-497xk\" (UID: \"80e4c493-69b8-4854-b25a-5126fd02720e\") " pod="metallb-system/controller-7bb4cc7c98-497xk"
Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.838556 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80e4c493-69b8-4854-b25a-5126fd02720e-metrics-certs\") pod \"controller-7bb4cc7c98-497xk\" (UID: \"80e4c493-69b8-4854-b25a-5126fd02720e\") " pod="metallb-system/controller-7bb4cc7c98-497xk"
Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.838578 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f0349b48-f18d-415d-bb8c-2ee11d489f9e-metallb-excludel2\") pod \"speaker-67kh2\" (UID: \"f0349b48-f18d-415d-bb8c-2ee11d489f9e\") " pod="metallb-system/speaker-67kh2"
Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.839178 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/f0349b48-f18d-415d-bb8c-2ee11d489f9e-metallb-excludel2\") pod \"speaker-67kh2\" (UID: \"f0349b48-f18d-415d-bb8c-2ee11d489f9e\") " pod="metallb-system/speaker-67kh2"
Mar 17 11:28:05 crc kubenswrapper[4742]: E0317 11:28:05.839386 4742 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 17 11:28:05 crc kubenswrapper[4742]: E0317 11:28:05.839452 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0349b48-f18d-415d-bb8c-2ee11d489f9e-memberlist podName:f0349b48-f18d-415d-bb8c-2ee11d489f9e nodeName:}" failed. No retries permitted until 2026-03-17 11:28:06.339430759 +0000 UTC m=+989.465558617 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f0349b48-f18d-415d-bb8c-2ee11d489f9e-memberlist") pod "speaker-67kh2" (UID: "f0349b48-f18d-415d-bb8c-2ee11d489f9e") : secret "metallb-memberlist" not found
Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.841633 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80e4c493-69b8-4854-b25a-5126fd02720e-metrics-certs\") pod \"controller-7bb4cc7c98-497xk\" (UID: \"80e4c493-69b8-4854-b25a-5126fd02720e\") " pod="metallb-system/controller-7bb4cc7c98-497xk"
Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.841894 4742 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.842337 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f0349b48-f18d-415d-bb8c-2ee11d489f9e-metrics-certs\") pod \"speaker-67kh2\" (UID: \"f0349b48-f18d-415d-bb8c-2ee11d489f9e\") " pod="metallb-system/speaker-67kh2"
Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.852795 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80e4c493-69b8-4854-b25a-5126fd02720e-cert\") pod \"controller-7bb4cc7c98-497xk\" (UID: \"80e4c493-69b8-4854-b25a-5126fd02720e\") " pod="metallb-system/controller-7bb4cc7c98-497xk"
Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.858078 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfmdz\" (UniqueName: \"kubernetes.io/projected/f0349b48-f18d-415d-bb8c-2ee11d489f9e-kube-api-access-vfmdz\") pod \"speaker-67kh2\" (UID: \"f0349b48-f18d-415d-bb8c-2ee11d489f9e\") " pod="metallb-system/speaker-67kh2"
Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.859532 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6g55\" (UniqueName: \"kubernetes.io/projected/80e4c493-69b8-4854-b25a-5126fd02720e-kube-api-access-h6g55\") pod \"controller-7bb4cc7c98-497xk\" (UID: \"80e4c493-69b8-4854-b25a-5126fd02720e\") " pod="metallb-system/controller-7bb4cc7c98-497xk"
Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.862370 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-8rkdr"
Mar 17 11:28:05 crc kubenswrapper[4742]: I0317 11:28:05.984417 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-497xk"
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-497xk" Mar 17 11:28:06 crc kubenswrapper[4742]: I0317 11:28:06.246115 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e890c085-704d-45c9-9166-3d27780a18f6-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-pfql6\" (UID: \"e890c085-704d-45c9-9166-3d27780a18f6\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pfql6" Mar 17 11:28:06 crc kubenswrapper[4742]: I0317 11:28:06.250984 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e890c085-704d-45c9-9166-3d27780a18f6-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-pfql6\" (UID: \"e890c085-704d-45c9-9166-3d27780a18f6\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pfql6" Mar 17 11:28:06 crc kubenswrapper[4742]: I0317 11:28:06.347641 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f0349b48-f18d-415d-bb8c-2ee11d489f9e-memberlist\") pod \"speaker-67kh2\" (UID: \"f0349b48-f18d-415d-bb8c-2ee11d489f9e\") " pod="metallb-system/speaker-67kh2" Mar 17 11:28:06 crc kubenswrapper[4742]: E0317 11:28:06.347790 4742 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 17 11:28:06 crc kubenswrapper[4742]: E0317 11:28:06.347887 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0349b48-f18d-415d-bb8c-2ee11d489f9e-memberlist podName:f0349b48-f18d-415d-bb8c-2ee11d489f9e nodeName:}" failed. No retries permitted until 2026-03-17 11:28:07.347863832 +0000 UTC m=+990.473991620 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/f0349b48-f18d-415d-bb8c-2ee11d489f9e-memberlist") pod "speaker-67kh2" (UID: "f0349b48-f18d-415d-bb8c-2ee11d489f9e") : secret "metallb-memberlist" not found Mar 17 11:28:06 crc kubenswrapper[4742]: I0317 11:28:06.407135 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-497xk"] Mar 17 11:28:06 crc kubenswrapper[4742]: W0317 11:28:06.417541 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80e4c493_69b8_4854_b25a_5126fd02720e.slice/crio-e248b6a9dd4ac1c51eca9b91f65fa55a5f32b7d58051a0b7e238f0d73ae90092 WatchSource:0}: Error finding container e248b6a9dd4ac1c51eca9b91f65fa55a5f32b7d58051a0b7e238f0d73ae90092: Status 404 returned error can't find the container with id e248b6a9dd4ac1c51eca9b91f65fa55a5f32b7d58051a0b7e238f0d73ae90092 Mar 17 11:28:06 crc kubenswrapper[4742]: I0317 11:28:06.477923 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pfql6" Mar 17 11:28:06 crc kubenswrapper[4742]: I0317 11:28:06.507878 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-497xk" event={"ID":"80e4c493-69b8-4854-b25a-5126fd02720e","Type":"ContainerStarted","Data":"e248b6a9dd4ac1c51eca9b91f65fa55a5f32b7d58051a0b7e238f0d73ae90092"} Mar 17 11:28:06 crc kubenswrapper[4742]: I0317 11:28:06.508927 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8rkdr" event={"ID":"11909043-e311-4bf8-9ecf-8b3d33d2584a","Type":"ContainerStarted","Data":"ebc3e2d1b76fd21cd2ae7a5b2f9c1ea82498cb80c2943c9034bcb40672e3049c"} Mar 17 11:28:06 crc kubenswrapper[4742]: I0317 11:28:06.678391 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9ad19b9-d849-4cb6-9ac9-3a35f9de9927" path="/var/lib/kubelet/pods/d9ad19b9-d849-4cb6-9ac9-3a35f9de9927/volumes" Mar 17 11:28:06 crc kubenswrapper[4742]: I0317 11:28:06.964817 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-pfql6"] Mar 17 11:28:06 crc kubenswrapper[4742]: W0317 11:28:06.977773 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode890c085_704d_45c9_9166_3d27780a18f6.slice/crio-0941f2d91e306627e3edd71c4c1502da66bbd0b3359e061e1e4e6d66bc660b15 WatchSource:0}: Error finding container 0941f2d91e306627e3edd71c4c1502da66bbd0b3359e061e1e4e6d66bc660b15: Status 404 returned error can't find the container with id 0941f2d91e306627e3edd71c4c1502da66bbd0b3359e061e1e4e6d66bc660b15 Mar 17 11:28:07 crc kubenswrapper[4742]: I0317 11:28:07.363255 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f0349b48-f18d-415d-bb8c-2ee11d489f9e-memberlist\") pod \"speaker-67kh2\" (UID: \"f0349b48-f18d-415d-bb8c-2ee11d489f9e\") " pod="metallb-system/speaker-67kh2" Mar 17 11:28:07 crc kubenswrapper[4742]: I0317 11:28:07.370539 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/f0349b48-f18d-415d-bb8c-2ee11d489f9e-memberlist\") pod \"speaker-67kh2\" (UID: \"f0349b48-f18d-415d-bb8c-2ee11d489f9e\") " pod="metallb-system/speaker-67kh2" Mar 17 11:28:07 crc kubenswrapper[4742]: I0317 11:28:07.470490 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-67kh2" Mar 17 11:28:07 crc kubenswrapper[4742]: I0317 11:28:07.516923 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-67kh2" event={"ID":"f0349b48-f18d-415d-bb8c-2ee11d489f9e","Type":"ContainerStarted","Data":"7e617709cc5d632d5ae8941c494d515cac058c4e1df26c9357d4c1d7bc24ea14"} Mar 17 11:28:07 crc kubenswrapper[4742]: I0317 11:28:07.519176 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pfql6" event={"ID":"e890c085-704d-45c9-9166-3d27780a18f6","Type":"ContainerStarted","Data":"0941f2d91e306627e3edd71c4c1502da66bbd0b3359e061e1e4e6d66bc660b15"} Mar 17 11:28:07 crc kubenswrapper[4742]: I0317 11:28:07.520817 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-497xk" event={"ID":"80e4c493-69b8-4854-b25a-5126fd02720e","Type":"ContainerStarted","Data":"06b7a1c4a1491f817b129f2fe8fb4f48907b74ee9791b56f47f11a4b8ef97c3a"} Mar 17 11:28:07 crc kubenswrapper[4742]: I0317 11:28:07.520843 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-497xk" event={"ID":"80e4c493-69b8-4854-b25a-5126fd02720e","Type":"ContainerStarted","Data":"e9e909b45559c644664b24e092f082aa2d3f62e9548a027c4e356381b1abf8a4"} Mar 17 11:28:07 crc kubenswrapper[4742]: I0317 11:28:07.521090 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-497xk" Mar 17 11:28:07 crc kubenswrapper[4742]: I0317 11:28:07.541090 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-497xk" podStartSLOduration=2.541074596 podStartE2EDuration="2.541074596s" podCreationTimestamp="2026-03-17 11:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:28:07.537551498 +0000 UTC m=+990.663679276" watchObservedRunningTime="2026-03-17 11:28:07.541074596 +0000 UTC m=+990.667202354" Mar 17 11:28:08 crc kubenswrapper[4742]: I0317 11:28:08.528494 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-67kh2" event={"ID":"f0349b48-f18d-415d-bb8c-2ee11d489f9e","Type":"ContainerStarted","Data":"496339d52357af008f981371bcccb8fdf09a6304fb40404e6c6d38639fd7e90c"} Mar 17 11:28:08 crc kubenswrapper[4742]: I0317 11:28:08.528754 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-67kh2" event={"ID":"f0349b48-f18d-415d-bb8c-2ee11d489f9e","Type":"ContainerStarted","Data":"bd25c4b95416b6e2eb7b09d7bf211b90021386b87d7ea94fdb37163fca741ac8"} Mar 17 11:28:08 crc kubenswrapper[4742]: I0317 11:28:08.528771 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-67kh2" Mar 17 11:28:08 crc kubenswrapper[4742]: I0317 11:28:08.553977 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-67kh2" podStartSLOduration=3.553960612 podStartE2EDuration="3.553960612s" podCreationTimestamp="2026-03-17 11:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:28:08.551650088 +0000 UTC m=+991.677777886" watchObservedRunningTime="2026-03-17 11:28:08.553960612 +0000 UTC m=+991.680088370" Mar 17 11:28:13 crc kubenswrapper[4742]: I0317 11:28:13.567107 4742 generic.go:334] "Generic (PLEG): container finished" 
podID="11909043-e311-4bf8-9ecf-8b3d33d2584a" containerID="620946ca6f3f4548de137ac9fd5d21341c65960c1e57586b271e923e3c45577b" exitCode=0 Mar 17 11:28:13 crc kubenswrapper[4742]: I0317 11:28:13.567182 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8rkdr" event={"ID":"11909043-e311-4bf8-9ecf-8b3d33d2584a","Type":"ContainerDied","Data":"620946ca6f3f4548de137ac9fd5d21341c65960c1e57586b271e923e3c45577b"} Mar 17 11:28:13 crc kubenswrapper[4742]: I0317 11:28:13.570613 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pfql6" event={"ID":"e890c085-704d-45c9-9166-3d27780a18f6","Type":"ContainerStarted","Data":"f9312e3d9e66249dce6e8435817fbcccc69cf7cc699052d9f0999eba170af96f"} Mar 17 11:28:13 crc kubenswrapper[4742]: I0317 11:28:13.570830 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pfql6" Mar 17 11:28:13 crc kubenswrapper[4742]: I0317 11:28:13.635270 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pfql6" podStartSLOduration=2.574078302 podStartE2EDuration="8.635246518s" podCreationTimestamp="2026-03-17 11:28:05 +0000 UTC" firstStartedPulling="2026-03-17 11:28:06.979933549 +0000 UTC m=+990.106061307" lastFinishedPulling="2026-03-17 11:28:13.041101745 +0000 UTC m=+996.167229523" observedRunningTime="2026-03-17 11:28:13.632506972 +0000 UTC m=+996.758634770" watchObservedRunningTime="2026-03-17 11:28:13.635246518 +0000 UTC m=+996.761374316" Mar 17 11:28:14 crc kubenswrapper[4742]: I0317 11:28:14.582788 4742 generic.go:334] "Generic (PLEG): container finished" podID="11909043-e311-4bf8-9ecf-8b3d33d2584a" containerID="e27f71884cc085ce56865db82578cf56c01c2ad266625710008b018b74277c18" exitCode=0 Mar 17 11:28:14 crc kubenswrapper[4742]: I0317 11:28:14.582869 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8rkdr" event={"ID":"11909043-e311-4bf8-9ecf-8b3d33d2584a","Type":"ContainerDied","Data":"e27f71884cc085ce56865db82578cf56c01c2ad266625710008b018b74277c18"} Mar 17 11:28:15 crc kubenswrapper[4742]: I0317 11:28:15.599262 4742 generic.go:334] "Generic (PLEG): container finished" podID="11909043-e311-4bf8-9ecf-8b3d33d2584a" containerID="7b4d885e24afab9f3c15a078fde7092eef6c7ebe5626e87cd104de4637428d47" exitCode=0 Mar 17 11:28:15 crc kubenswrapper[4742]: I0317 11:28:15.599367 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8rkdr" event={"ID":"11909043-e311-4bf8-9ecf-8b3d33d2584a","Type":"ContainerDied","Data":"7b4d885e24afab9f3c15a078fde7092eef6c7ebe5626e87cd104de4637428d47"} Mar 17 11:28:16 crc kubenswrapper[4742]: I0317 11:28:16.630439 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8rkdr" event={"ID":"11909043-e311-4bf8-9ecf-8b3d33d2584a","Type":"ContainerStarted","Data":"c9ec4cd0f02d221c84fce3d4d6d27c60ecb3349150f20b23aaaf3cc1c1e69ec8"} Mar 17 11:28:16 crc kubenswrapper[4742]: I0317 11:28:16.630786 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8rkdr" event={"ID":"11909043-e311-4bf8-9ecf-8b3d33d2584a","Type":"ContainerStarted","Data":"2efd9f6ecc2b826d4c2ac47a04bb846da6fe4ae73680ccf418dfbf62c38822f3"} Mar 17 11:28:16 crc kubenswrapper[4742]: I0317 11:28:16.630800 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8rkdr" 
event={"ID":"11909043-e311-4bf8-9ecf-8b3d33d2584a","Type":"ContainerStarted","Data":"7475abb964382769f7b77f21488a774f17ee6031e35ec5f6c9a847422ea5eb0c"} Mar 17 11:28:16 crc kubenswrapper[4742]: I0317 11:28:16.630811 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8rkdr" event={"ID":"11909043-e311-4bf8-9ecf-8b3d33d2584a","Type":"ContainerStarted","Data":"a61514332116e53124517817797152f1e48e62996cae52c79cebd5fc40499b90"} Mar 17 11:28:16 crc kubenswrapper[4742]: I0317 11:28:16.630821 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8rkdr" event={"ID":"11909043-e311-4bf8-9ecf-8b3d33d2584a","Type":"ContainerStarted","Data":"cf2df5e3887e05c645bf2f5c0414d68761a8801f4b95bd4e76b59a78f681f396"} Mar 17 11:28:17 crc kubenswrapper[4742]: I0317 11:28:17.476225 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-67kh2" Mar 17 11:28:17 crc kubenswrapper[4742]: I0317 11:28:17.647045 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-8rkdr" event={"ID":"11909043-e311-4bf8-9ecf-8b3d33d2584a","Type":"ContainerStarted","Data":"4735a83982cbcca5f31092e54b2d913e9c666e50d854e0d9aa4cc477cb309d9c"} Mar 17 11:28:17 crc kubenswrapper[4742]: I0317 11:28:17.648515 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:17 crc kubenswrapper[4742]: I0317 11:28:17.695623 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-8rkdr" podStartSLOduration=5.628378135 podStartE2EDuration="12.695605176s" podCreationTimestamp="2026-03-17 11:28:05 +0000 UTC" firstStartedPulling="2026-03-17 11:28:05.982427321 +0000 UTC m=+989.108555089" lastFinishedPulling="2026-03-17 11:28:13.049654352 +0000 UTC m=+996.175782130" observedRunningTime="2026-03-17 11:28:17.691855103 +0000 UTC m=+1000.817982871" watchObservedRunningTime="2026-03-17 11:28:17.695605176 +0000 UTC m=+1000.821732944" Mar 17 11:28:18 crc kubenswrapper[4742]: I0317 11:28:18.043948 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:28:18 crc kubenswrapper[4742]: I0317 11:28:18.044367 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:28:20 crc kubenswrapper[4742]: I0317 11:28:20.392794 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-p5x4h"] Mar 17 11:28:20 crc kubenswrapper[4742]: I0317 11:28:20.394986 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-p5x4h" Mar 17 11:28:20 crc kubenswrapper[4742]: I0317 11:28:20.399117 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 17 11:28:20 crc kubenswrapper[4742]: I0317 11:28:20.399844 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 17 11:28:20 crc kubenswrapper[4742]: I0317 11:28:20.400528 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p5x4h"] Mar 17 11:28:20 crc kubenswrapper[4742]: I0317 11:28:20.401841 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-sz8pd" Mar 17 11:28:20 crc kubenswrapper[4742]: I0317 11:28:20.476871 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l6jd\" (UniqueName: \"kubernetes.io/projected/f201d18f-aeaf-4598-a8a4-10703f915f1b-kube-api-access-5l6jd\") pod \"openstack-operator-index-p5x4h\" (UID: \"f201d18f-aeaf-4598-a8a4-10703f915f1b\") " pod="openstack-operators/openstack-operator-index-p5x4h" Mar 17 11:28:20 crc kubenswrapper[4742]: I0317 11:28:20.578350 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l6jd\" (UniqueName: \"kubernetes.io/projected/f201d18f-aeaf-4598-a8a4-10703f915f1b-kube-api-access-5l6jd\") pod \"openstack-operator-index-p5x4h\" (UID: \"f201d18f-aeaf-4598-a8a4-10703f915f1b\") " pod="openstack-operators/openstack-operator-index-p5x4h" Mar 17 11:28:20 crc kubenswrapper[4742]: I0317 11:28:20.604359 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l6jd\" (UniqueName: \"kubernetes.io/projected/f201d18f-aeaf-4598-a8a4-10703f915f1b-kube-api-access-5l6jd\") pod \"openstack-operator-index-p5x4h\" (UID: \"f201d18f-aeaf-4598-a8a4-10703f915f1b\") " pod="openstack-operators/openstack-operator-index-p5x4h" Mar 17 11:28:20 crc kubenswrapper[4742]: I0317 11:28:20.758099 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-p5x4h" Mar 17 11:28:20 crc kubenswrapper[4742]: I0317 11:28:20.865862 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:20 crc kubenswrapper[4742]: I0317 11:28:20.937544 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:21 crc kubenswrapper[4742]: I0317 11:28:21.178039 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p5x4h"] Mar 17 11:28:21 crc kubenswrapper[4742]: I0317 11:28:21.677973 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p5x4h" event={"ID":"f201d18f-aeaf-4598-a8a4-10703f915f1b","Type":"ContainerStarted","Data":"dc16334005c3e0deaadd0ddf6ec008fa87c7e79893f1569af4d84059db62d9e4"} Mar 17 11:28:24 crc kubenswrapper[4742]: I0317 11:28:24.638924 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-p5x4h"] Mar 17 11:28:24 crc kubenswrapper[4742]: I0317 11:28:24.711880 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p5x4h" event={"ID":"f201d18f-aeaf-4598-a8a4-10703f915f1b","Type":"ContainerStarted","Data":"e5c1f14ec9934fc035ff85d9c1dd80a1eb3a4cfd3a616703303b4342dceca203"} Mar 17 11:28:24 crc kubenswrapper[4742]: I0317 11:28:24.730806 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-p5x4h" podStartSLOduration=2.254443222 podStartE2EDuration="4.730762358s" podCreationTimestamp="2026-03-17 11:28:20 +0000 UTC" firstStartedPulling="2026-03-17 11:28:21.186602689 +0000 UTC m=+1004.312730447" lastFinishedPulling="2026-03-17 11:28:23.662921825 +0000 UTC m=+1006.789049583" observedRunningTime="2026-03-17 11:28:24.728779043 +0000 UTC m=+1007.854906831" watchObservedRunningTime="2026-03-17 11:28:24.730762358 +0000 UTC m=+1007.856890126" Mar 17 11:28:25 crc kubenswrapper[4742]: I0317 11:28:25.390232 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-d2ktx"] Mar 17 11:28:25 crc kubenswrapper[4742]: I0317 11:28:25.398877 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-d2ktx" Mar 17 11:28:25 crc kubenswrapper[4742]: I0317 11:28:25.402599 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-d2ktx"] Mar 17 11:28:25 crc kubenswrapper[4742]: I0317 11:28:25.446890 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmf57\" (UniqueName: \"kubernetes.io/projected/37e024f1-44f6-48c9-ba86-323127371c28-kube-api-access-mmf57\") pod \"openstack-operator-index-d2ktx\" (UID: \"37e024f1-44f6-48c9-ba86-323127371c28\") " pod="openstack-operators/openstack-operator-index-d2ktx" Mar 17 11:28:25 crc kubenswrapper[4742]: I0317 11:28:25.548615 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmf57\" (UniqueName: \"kubernetes.io/projected/37e024f1-44f6-48c9-ba86-323127371c28-kube-api-access-mmf57\") pod \"openstack-operator-index-d2ktx\" (UID: \"37e024f1-44f6-48c9-ba86-323127371c28\") " pod="openstack-operators/openstack-operator-index-d2ktx" Mar 17 11:28:25 crc kubenswrapper[4742]: I0317 11:28:25.584374 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmf57\" (UniqueName: \"kubernetes.io/projected/37e024f1-44f6-48c9-ba86-323127371c28-kube-api-access-mmf57\") pod \"openstack-operator-index-d2ktx\" (UID: \"37e024f1-44f6-48c9-ba86-323127371c28\") " pod="openstack-operators/openstack-operator-index-d2ktx" Mar 17 11:28:25 crc kubenswrapper[4742]: I0317 11:28:25.720336 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-p5x4h" podUID="f201d18f-aeaf-4598-a8a4-10703f915f1b" containerName="registry-server" containerID="cri-o://e5c1f14ec9934fc035ff85d9c1dd80a1eb3a4cfd3a616703303b4342dceca203" gracePeriod=2 Mar 17 11:28:25 crc kubenswrapper[4742]: I0317 11:28:25.726659 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-d2ktx" Mar 17 11:28:25 crc kubenswrapper[4742]: I0317 11:28:25.867862 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-8rkdr" Mar 17 11:28:25 crc kubenswrapper[4742]: I0317 11:28:25.990149 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-497xk" Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.164296 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-p5x4h" Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.248008 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-d2ktx"] Mar 17 11:28:26 crc kubenswrapper[4742]: W0317 11:28:26.253448 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37e024f1_44f6_48c9_ba86_323127371c28.slice/crio-aba449ee89829a62abca2e4e435af403a1dbdbb6e078cae1ff6d767cc5142588 WatchSource:0}: Error finding container aba449ee89829a62abca2e4e435af403a1dbdbb6e078cae1ff6d767cc5142588: Status 404 returned error can't find the container with id aba449ee89829a62abca2e4e435af403a1dbdbb6e078cae1ff6d767cc5142588 Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.264508 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l6jd\" (UniqueName: \"kubernetes.io/projected/f201d18f-aeaf-4598-a8a4-10703f915f1b-kube-api-access-5l6jd\") pod \"f201d18f-aeaf-4598-a8a4-10703f915f1b\" (UID: \"f201d18f-aeaf-4598-a8a4-10703f915f1b\") " Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.270330 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f201d18f-aeaf-4598-a8a4-10703f915f1b-kube-api-access-5l6jd" (OuterVolumeSpecName: "kube-api-access-5l6jd") pod "f201d18f-aeaf-4598-a8a4-10703f915f1b" (UID: "f201d18f-aeaf-4598-a8a4-10703f915f1b"). InnerVolumeSpecName "kube-api-access-5l6jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.366297 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l6jd\" (UniqueName: \"kubernetes.io/projected/f201d18f-aeaf-4598-a8a4-10703f915f1b-kube-api-access-5l6jd\") on node \"crc\" DevicePath \"\"" Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.486952 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-pfql6" Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.728301 4742 generic.go:334] "Generic (PLEG): container finished" podID="f201d18f-aeaf-4598-a8a4-10703f915f1b" containerID="e5c1f14ec9934fc035ff85d9c1dd80a1eb3a4cfd3a616703303b4342dceca203" exitCode=0 Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.728383 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-p5x4h" Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.728384 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p5x4h" event={"ID":"f201d18f-aeaf-4598-a8a4-10703f915f1b","Type":"ContainerDied","Data":"e5c1f14ec9934fc035ff85d9c1dd80a1eb3a4cfd3a616703303b4342dceca203"} Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.728462 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p5x4h" event={"ID":"f201d18f-aeaf-4598-a8a4-10703f915f1b","Type":"ContainerDied","Data":"dc16334005c3e0deaadd0ddf6ec008fa87c7e79893f1569af4d84059db62d9e4"} Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.728491 4742 scope.go:117] "RemoveContainer" containerID="e5c1f14ec9934fc035ff85d9c1dd80a1eb3a4cfd3a616703303b4342dceca203" Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.730539 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-d2ktx" event={"ID":"37e024f1-44f6-48c9-ba86-323127371c28","Type":"ContainerStarted","Data":"54a32091088d79f27ed21d70a3e61218ccbdbeef42fabc82ed2f0be9493e489a"} Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.730576 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-d2ktx" event={"ID":"37e024f1-44f6-48c9-ba86-323127371c28","Type":"ContainerStarted","Data":"aba449ee89829a62abca2e4e435af403a1dbdbb6e078cae1ff6d767cc5142588"} Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.748996 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-p5x4h"] Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.751817 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-p5x4h"] Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.756919 4742 scope.go:117] "RemoveContainer" containerID="e5c1f14ec9934fc035ff85d9c1dd80a1eb3a4cfd3a616703303b4342dceca203" Mar 17 11:28:26 crc kubenswrapper[4742]: E0317 11:28:26.757440 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5c1f14ec9934fc035ff85d9c1dd80a1eb3a4cfd3a616703303b4342dceca203\": container with ID starting with e5c1f14ec9934fc035ff85d9c1dd80a1eb3a4cfd3a616703303b4342dceca203 not found: ID does not exist" containerID="e5c1f14ec9934fc035ff85d9c1dd80a1eb3a4cfd3a616703303b4342dceca203" Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.757473 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c1f14ec9934fc035ff85d9c1dd80a1eb3a4cfd3a616703303b4342dceca203"} err="failed to get container status \"e5c1f14ec9934fc035ff85d9c1dd80a1eb3a4cfd3a616703303b4342dceca203\": rpc error: code = NotFound desc = could not find container \"e5c1f14ec9934fc035ff85d9c1dd80a1eb3a4cfd3a616703303b4342dceca203\": container with ID starting with e5c1f14ec9934fc035ff85d9c1dd80a1eb3a4cfd3a616703303b4342dceca203 not found: ID does not exist" Mar 17 11:28:26 crc kubenswrapper[4742]: I0317 11:28:26.766937 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-d2ktx" podStartSLOduration=1.713421051 podStartE2EDuration="1.766898417s" podCreationTimestamp="2026-03-17 11:28:25 +0000 UTC" firstStartedPulling="2026-03-17 11:28:26.256388356 +0000 UTC m=+1009.382516114" 
lastFinishedPulling="2026-03-17 11:28:26.309865682 +0000 UTC m=+1009.435993480" observedRunningTime="2026-03-17 11:28:26.760937491 +0000 UTC m=+1009.887065249" watchObservedRunningTime="2026-03-17 11:28:26.766898417 +0000 UTC m=+1009.893026175" Mar 17 11:28:28 crc kubenswrapper[4742]: I0317 11:28:28.912756 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f201d18f-aeaf-4598-a8a4-10703f915f1b" path="/var/lib/kubelet/pods/f201d18f-aeaf-4598-a8a4-10703f915f1b/volumes" Mar 17 11:28:32 crc kubenswrapper[4742]: I0317 11:28:32.647303 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7lpl7"] Mar 17 11:28:32 crc kubenswrapper[4742]: E0317 11:28:32.648328 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f201d18f-aeaf-4598-a8a4-10703f915f1b" containerName="registry-server" Mar 17 11:28:32 crc kubenswrapper[4742]: I0317 11:28:32.648353 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f201d18f-aeaf-4598-a8a4-10703f915f1b" containerName="registry-server" Mar 17 11:28:32 crc kubenswrapper[4742]: I0317 11:28:32.648822 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f201d18f-aeaf-4598-a8a4-10703f915f1b" containerName="registry-server" Mar 17 11:28:32 crc kubenswrapper[4742]: I0317 11:28:32.651695 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7lpl7" Mar 17 11:28:32 crc kubenswrapper[4742]: I0317 11:28:32.682129 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7lpl7"] Mar 17 11:28:32 crc kubenswrapper[4742]: I0317 11:28:32.759673 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlc5q\" (UniqueName: \"kubernetes.io/projected/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-kube-api-access-dlc5q\") pod \"community-operators-7lpl7\" (UID: \"a09824cc-eb9f-4e3b-b382-f9a78771fa4a\") " pod="openshift-marketplace/community-operators-7lpl7" Mar 17 11:28:32 crc kubenswrapper[4742]: I0317 11:28:32.760193 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-catalog-content\") pod \"community-operators-7lpl7\" (UID: \"a09824cc-eb9f-4e3b-b382-f9a78771fa4a\") " pod="openshift-marketplace/community-operators-7lpl7" Mar 17 11:28:32 crc kubenswrapper[4742]: I0317 11:28:32.760297 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-utilities\") pod \"community-operators-7lpl7\" (UID: \"a09824cc-eb9f-4e3b-b382-f9a78771fa4a\") " pod="openshift-marketplace/community-operators-7lpl7" Mar 17 11:28:32 crc kubenswrapper[4742]: I0317 11:28:32.861025 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-utilities\") pod \"community-operators-7lpl7\" (UID: \"a09824cc-eb9f-4e3b-b382-f9a78771fa4a\") " pod="openshift-marketplace/community-operators-7lpl7" Mar 17 11:28:32 crc kubenswrapper[4742]: I0317 11:28:32.861237 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlc5q\" (UniqueName: \"kubernetes.io/projected/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-kube-api-access-dlc5q\") pod 
\"community-operators-7lpl7\" (UID: \"a09824cc-eb9f-4e3b-b382-f9a78771fa4a\") " pod="openshift-marketplace/community-operators-7lpl7" Mar 17 11:28:32 crc kubenswrapper[4742]: I0317 11:28:32.861462 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-catalog-content\") pod \"community-operators-7lpl7\" (UID: \"a09824cc-eb9f-4e3b-b382-f9a78771fa4a\") " pod="openshift-marketplace/community-operators-7lpl7" Mar 17 11:28:32 crc kubenswrapper[4742]: I0317 11:28:32.861549 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-utilities\") pod \"community-operators-7lpl7\" (UID: \"a09824cc-eb9f-4e3b-b382-f9a78771fa4a\") " pod="openshift-marketplace/community-operators-7lpl7" Mar 17 11:28:32 crc kubenswrapper[4742]: I0317 11:28:32.861832 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-catalog-content\") pod \"community-operators-7lpl7\" (UID: \"a09824cc-eb9f-4e3b-b382-f9a78771fa4a\") " pod="openshift-marketplace/community-operators-7lpl7" Mar 17 11:28:32 crc kubenswrapper[4742]: I0317 11:28:32.884658 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlc5q\" (UniqueName: \"kubernetes.io/projected/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-kube-api-access-dlc5q\") pod \"community-operators-7lpl7\" (UID: \"a09824cc-eb9f-4e3b-b382-f9a78771fa4a\") " pod="openshift-marketplace/community-operators-7lpl7" Mar 17 11:28:32 crc kubenswrapper[4742]: I0317 11:28:32.985492 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7lpl7" Mar 17 11:28:33 crc kubenswrapper[4742]: I0317 11:28:33.239139 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7lpl7"] Mar 17 11:28:33 crc kubenswrapper[4742]: I0317 11:28:33.792259 4742 generic.go:334] "Generic (PLEG): container finished" podID="a09824cc-eb9f-4e3b-b382-f9a78771fa4a" containerID="4638c3688b48a09382330e6a79545b6b16cad70d12cc00bf9e81c23f1266c1d3" exitCode=0 Mar 17 11:28:33 crc kubenswrapper[4742]: I0317 11:28:33.792370 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lpl7" event={"ID":"a09824cc-eb9f-4e3b-b382-f9a78771fa4a","Type":"ContainerDied","Data":"4638c3688b48a09382330e6a79545b6b16cad70d12cc00bf9e81c23f1266c1d3"} Mar 17 11:28:33 crc kubenswrapper[4742]: I0317 11:28:33.792548 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lpl7" event={"ID":"a09824cc-eb9f-4e3b-b382-f9a78771fa4a","Type":"ContainerStarted","Data":"7a444a07117ce0bcf2ef61bd80c754fce196874a04a2109452d1b18b765a1b4a"} Mar 17 11:28:34 crc kubenswrapper[4742]: I0317 11:28:34.803059 4742 generic.go:334] "Generic (PLEG): container finished" podID="a09824cc-eb9f-4e3b-b382-f9a78771fa4a" containerID="4ff9995834d4caf9c8f1b4bcaf7f0175a210f86408deb2f81acd707a4878671a" exitCode=0 Mar 17 11:28:34 crc kubenswrapper[4742]: I0317 11:28:34.803106 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lpl7" event={"ID":"a09824cc-eb9f-4e3b-b382-f9a78771fa4a","Type":"ContainerDied","Data":"4ff9995834d4caf9c8f1b4bcaf7f0175a210f86408deb2f81acd707a4878671a"} Mar 17 11:28:35 crc kubenswrapper[4742]: I0317 11:28:35.727789 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-d2ktx" Mar 17 11:28:35 crc kubenswrapper[4742]: I0317 11:28:35.727839 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-d2ktx" Mar 17 11:28:35 crc kubenswrapper[4742]: I0317 11:28:35.768653 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-d2ktx" Mar 17 11:28:35 crc kubenswrapper[4742]: I0317 11:28:35.813153 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lpl7" event={"ID":"a09824cc-eb9f-4e3b-b382-f9a78771fa4a","Type":"ContainerStarted","Data":"60e6a51b4513dac7766c26f05292c7b6f5e39c9bf36084e1e0fcfb1196705423"} Mar 17 11:28:35 crc kubenswrapper[4742]: I0317 11:28:35.841056 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7lpl7" podStartSLOduration=2.425388883 podStartE2EDuration="3.841039156s" podCreationTimestamp="2026-03-17 11:28:32 +0000 UTC" firstStartedPulling="2026-03-17 11:28:33.793569373 +0000 UTC m=+1016.919697151" lastFinishedPulling="2026-03-17 11:28:35.209219616 +0000 UTC m=+1018.335347424" observedRunningTime="2026-03-17 11:28:35.839331219 +0000 UTC m=+1018.965458987" watchObservedRunningTime="2026-03-17 11:28:35.841039156 +0000 UTC m=+1018.967166924" Mar 17 11:28:35 crc kubenswrapper[4742]: I0317 11:28:35.844339 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-d2ktx" Mar 17 11:28:40 crc kubenswrapper[4742]: I0317 11:28:40.121489 4742 scope.go:117] 
"RemoveContainer" containerID="b5fdcc36049c8777522ecab712ae2e1e1abacf4d5942a333133dec78d5882702" Mar 17 11:28:42 crc kubenswrapper[4742]: I0317 11:28:42.985930 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7lpl7" Mar 17 11:28:42 crc kubenswrapper[4742]: I0317 11:28:42.986266 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7lpl7" Mar 17 11:28:43 crc kubenswrapper[4742]: I0317 11:28:43.045620 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7lpl7" Mar 17 11:28:43 crc kubenswrapper[4742]: I0317 11:28:43.917046 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7lpl7" Mar 17 11:28:43 crc kubenswrapper[4742]: I0317 11:28:43.999481 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7lpl7"] Mar 17 11:28:45 crc kubenswrapper[4742]: I0317 11:28:45.885985 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7lpl7" podUID="a09824cc-eb9f-4e3b-b382-f9a78771fa4a" containerName="registry-server" containerID="cri-o://60e6a51b4513dac7766c26f05292c7b6f5e39c9bf36084e1e0fcfb1196705423" gracePeriod=2 Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.355847 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7lpl7" Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.457637 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-catalog-content\") pod \"a09824cc-eb9f-4e3b-b382-f9a78771fa4a\" (UID: \"a09824cc-eb9f-4e3b-b382-f9a78771fa4a\") " Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.457699 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-utilities\") pod \"a09824cc-eb9f-4e3b-b382-f9a78771fa4a\" (UID: \"a09824cc-eb9f-4e3b-b382-f9a78771fa4a\") " Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.457724 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlc5q\" (UniqueName: \"kubernetes.io/projected/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-kube-api-access-dlc5q\") pod \"a09824cc-eb9f-4e3b-b382-f9a78771fa4a\" (UID: \"a09824cc-eb9f-4e3b-b382-f9a78771fa4a\") " Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.459327 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-utilities" (OuterVolumeSpecName: "utilities") pod "a09824cc-eb9f-4e3b-b382-f9a78771fa4a" (UID: "a09824cc-eb9f-4e3b-b382-f9a78771fa4a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.466838 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-kube-api-access-dlc5q" (OuterVolumeSpecName: "kube-api-access-dlc5q") pod "a09824cc-eb9f-4e3b-b382-f9a78771fa4a" (UID: "a09824cc-eb9f-4e3b-b382-f9a78771fa4a"). InnerVolumeSpecName "kube-api-access-dlc5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.519433 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a09824cc-eb9f-4e3b-b382-f9a78771fa4a" (UID: "a09824cc-eb9f-4e3b-b382-f9a78771fa4a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.559132 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.559163 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.559173 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlc5q\" (UniqueName: \"kubernetes.io/projected/a09824cc-eb9f-4e3b-b382-f9a78771fa4a-kube-api-access-dlc5q\") on node \"crc\" DevicePath \"\"" Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.903556 4742 generic.go:334] "Generic (PLEG): container finished" podID="a09824cc-eb9f-4e3b-b382-f9a78771fa4a" containerID="60e6a51b4513dac7766c26f05292c7b6f5e39c9bf36084e1e0fcfb1196705423" exitCode=0 Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.903634 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lpl7" event={"ID":"a09824cc-eb9f-4e3b-b382-f9a78771fa4a","Type":"ContainerDied","Data":"60e6a51b4513dac7766c26f05292c7b6f5e39c9bf36084e1e0fcfb1196705423"} Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.903656 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7lpl7" Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.903694 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7lpl7" event={"ID":"a09824cc-eb9f-4e3b-b382-f9a78771fa4a","Type":"ContainerDied","Data":"7a444a07117ce0bcf2ef61bd80c754fce196874a04a2109452d1b18b765a1b4a"} Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.903731 4742 scope.go:117] "RemoveContainer" containerID="60e6a51b4513dac7766c26f05292c7b6f5e39c9bf36084e1e0fcfb1196705423" Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.935063 4742 scope.go:117] "RemoveContainer" containerID="4ff9995834d4caf9c8f1b4bcaf7f0175a210f86408deb2f81acd707a4878671a" Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.948431 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7lpl7"] Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.956229 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7lpl7"] Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.970137 4742 scope.go:117] "RemoveContainer" containerID="4638c3688b48a09382330e6a79545b6b16cad70d12cc00bf9e81c23f1266c1d3" Mar 17 11:28:46 crc kubenswrapper[4742]: I0317 11:28:46.999192 4742 scope.go:117] "RemoveContainer" containerID="60e6a51b4513dac7766c26f05292c7b6f5e39c9bf36084e1e0fcfb1196705423" Mar 17 11:28:47 crc kubenswrapper[4742]: E0317 11:28:47.000250 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60e6a51b4513dac7766c26f05292c7b6f5e39c9bf36084e1e0fcfb1196705423\": container with ID starting with 60e6a51b4513dac7766c26f05292c7b6f5e39c9bf36084e1e0fcfb1196705423 not found: ID does not exist" containerID="60e6a51b4513dac7766c26f05292c7b6f5e39c9bf36084e1e0fcfb1196705423" Mar 17 11:28:47 crc kubenswrapper[4742]: I0317 11:28:47.000307 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60e6a51b4513dac7766c26f05292c7b6f5e39c9bf36084e1e0fcfb1196705423"} err="failed to get container status \"60e6a51b4513dac7766c26f05292c7b6f5e39c9bf36084e1e0fcfb1196705423\": rpc error: code = NotFound desc = could not find container \"60e6a51b4513dac7766c26f05292c7b6f5e39c9bf36084e1e0fcfb1196705423\": container with ID starting with 60e6a51b4513dac7766c26f05292c7b6f5e39c9bf36084e1e0fcfb1196705423 not found: ID does not exist" Mar 17 11:28:47 crc kubenswrapper[4742]: I0317 11:28:47.000346 4742 scope.go:117] "RemoveContainer" containerID="4ff9995834d4caf9c8f1b4bcaf7f0175a210f86408deb2f81acd707a4878671a" Mar 17 11:28:47 crc kubenswrapper[4742]: E0317 11:28:47.000949 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ff9995834d4caf9c8f1b4bcaf7f0175a210f86408deb2f81acd707a4878671a\": container with ID starting with 4ff9995834d4caf9c8f1b4bcaf7f0175a210f86408deb2f81acd707a4878671a not found: ID does not exist" containerID="4ff9995834d4caf9c8f1b4bcaf7f0175a210f86408deb2f81acd707a4878671a" Mar 17 11:28:47 crc kubenswrapper[4742]: I0317 11:28:47.000989 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ff9995834d4caf9c8f1b4bcaf7f0175a210f86408deb2f81acd707a4878671a"} err="failed to get container status \"4ff9995834d4caf9c8f1b4bcaf7f0175a210f86408deb2f81acd707a4878671a\": rpc error: code = NotFound desc = could not find 
container \"4ff9995834d4caf9c8f1b4bcaf7f0175a210f86408deb2f81acd707a4878671a\": container with ID starting with 4ff9995834d4caf9c8f1b4bcaf7f0175a210f86408deb2f81acd707a4878671a not found: ID does not exist" Mar 17 11:28:47 crc kubenswrapper[4742]: I0317 11:28:47.001031 4742 scope.go:117] "RemoveContainer" containerID="4638c3688b48a09382330e6a79545b6b16cad70d12cc00bf9e81c23f1266c1d3" Mar 17 11:28:47 crc kubenswrapper[4742]: E0317 11:28:47.001409 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4638c3688b48a09382330e6a79545b6b16cad70d12cc00bf9e81c23f1266c1d3\": container with ID starting with 4638c3688b48a09382330e6a79545b6b16cad70d12cc00bf9e81c23f1266c1d3 not found: ID does not exist" containerID="4638c3688b48a09382330e6a79545b6b16cad70d12cc00bf9e81c23f1266c1d3" Mar 17 11:28:47 crc kubenswrapper[4742]: I0317 11:28:47.001457 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4638c3688b48a09382330e6a79545b6b16cad70d12cc00bf9e81c23f1266c1d3"} err="failed to get container status \"4638c3688b48a09382330e6a79545b6b16cad70d12cc00bf9e81c23f1266c1d3\": rpc error: code = NotFound desc = could not find container \"4638c3688b48a09382330e6a79545b6b16cad70d12cc00bf9e81c23f1266c1d3\": container with ID starting with 4638c3688b48a09382330e6a79545b6b16cad70d12cc00bf9e81c23f1266c1d3 not found: ID does not exist" Mar 17 11:28:48 crc kubenswrapper[4742]: I0317 11:28:48.044188 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:28:48 crc kubenswrapper[4742]: I0317 11:28:48.044620 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:28:48 crc kubenswrapper[4742]: I0317 11:28:48.676958 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a09824cc-eb9f-4e3b-b382-f9a78771fa4a" path="/var/lib/kubelet/pods/a09824cc-eb9f-4e3b-b382-f9a78771fa4a/volumes" Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.529412 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q"] Mar 17 11:28:49 crc kubenswrapper[4742]: E0317 11:28:49.529745 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09824cc-eb9f-4e3b-b382-f9a78771fa4a" containerName="registry-server" Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.529765 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09824cc-eb9f-4e3b-b382-f9a78771fa4a" containerName="registry-server" Mar 17 11:28:49 crc kubenswrapper[4742]: E0317 11:28:49.529804 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09824cc-eb9f-4e3b-b382-f9a78771fa4a" containerName="extract-utilities" Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.529817 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09824cc-eb9f-4e3b-b382-f9a78771fa4a" containerName="extract-utilities" Mar 17 11:28:49 crc kubenswrapper[4742]: E0317 11:28:49.529838 4742 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a09824cc-eb9f-4e3b-b382-f9a78771fa4a" containerName="extract-content" Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.529850 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09824cc-eb9f-4e3b-b382-f9a78771fa4a" containerName="extract-content" Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.530088 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="a09824cc-eb9f-4e3b-b382-f9a78771fa4a" containerName="registry-server" Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.531534 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.534873 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-c8d7b" Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.547669 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q"] Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.702993 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6b4bfa7-c424-4a08-8a06-f73809217eff-util\") pod \"dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q\" (UID: \"e6b4bfa7-c424-4a08-8a06-f73809217eff\") " pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.703067 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6b4bfa7-c424-4a08-8a06-f73809217eff-bundle\") pod \"dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q\" (UID: \"e6b4bfa7-c424-4a08-8a06-f73809217eff\") " pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.703204 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhtln\" (UniqueName: \"kubernetes.io/projected/e6b4bfa7-c424-4a08-8a06-f73809217eff-kube-api-access-qhtln\") pod \"dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q\" (UID: \"e6b4bfa7-c424-4a08-8a06-f73809217eff\") " pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.804741 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhtln\" (UniqueName: \"kubernetes.io/projected/e6b4bfa7-c424-4a08-8a06-f73809217eff-kube-api-access-qhtln\") pod \"dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q\" (UID: \"e6b4bfa7-c424-4a08-8a06-f73809217eff\") " pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.804941 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6b4bfa7-c424-4a08-8a06-f73809217eff-util\") pod \"dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q\" (UID: \"e6b4bfa7-c424-4a08-8a06-f73809217eff\") " pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.805058 4742 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6b4bfa7-c424-4a08-8a06-f73809217eff-bundle\") pod \"dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q\" (UID: \"e6b4bfa7-c424-4a08-8a06-f73809217eff\") " pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.806226 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6b4bfa7-c424-4a08-8a06-f73809217eff-util\") pod \"dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q\" (UID: \"e6b4bfa7-c424-4a08-8a06-f73809217eff\") " pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.806348 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6b4bfa7-c424-4a08-8a06-f73809217eff-bundle\") pod \"dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q\" (UID: \"e6b4bfa7-c424-4a08-8a06-f73809217eff\") " pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.827958 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhtln\" (UniqueName: \"kubernetes.io/projected/e6b4bfa7-c424-4a08-8a06-f73809217eff-kube-api-access-qhtln\") pod \"dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q\" (UID: \"e6b4bfa7-c424-4a08-8a06-f73809217eff\") " pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" Mar 17 11:28:49 crc kubenswrapper[4742]: I0317 11:28:49.864561 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" Mar 17 11:28:50 crc kubenswrapper[4742]: I0317 11:28:50.300017 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q"] Mar 17 11:28:50 crc kubenswrapper[4742]: I0317 11:28:50.931733 4742 generic.go:334] "Generic (PLEG): container finished" podID="e6b4bfa7-c424-4a08-8a06-f73809217eff" containerID="99a375919f3b6d029fe79efc5212b31d3486e510b124f4f297a5077b5926d3be" exitCode=0 Mar 17 11:28:50 crc kubenswrapper[4742]: I0317 11:28:50.931833 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" event={"ID":"e6b4bfa7-c424-4a08-8a06-f73809217eff","Type":"ContainerDied","Data":"99a375919f3b6d029fe79efc5212b31d3486e510b124f4f297a5077b5926d3be"} Mar 17 11:28:50 crc kubenswrapper[4742]: I0317 11:28:50.932041 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" event={"ID":"e6b4bfa7-c424-4a08-8a06-f73809217eff","Type":"ContainerStarted","Data":"a31ff2b3a9a5669ad85dde6066f0ed5614ab232ca1c0c1940225ee5ec7f6ff46"} Mar 17 11:28:51 crc kubenswrapper[4742]: I0317 11:28:51.942422 4742 generic.go:334] "Generic (PLEG): container finished" podID="e6b4bfa7-c424-4a08-8a06-f73809217eff" containerID="9cdaff699e5095838dd64a286d8b6a2f4582f1ff5b582ed05ab6b59bb068e301" exitCode=0 Mar 17 11:28:51 crc kubenswrapper[4742]: I0317 11:28:51.942686 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" event={"ID":"e6b4bfa7-c424-4a08-8a06-f73809217eff","Type":"ContainerDied","Data":"9cdaff699e5095838dd64a286d8b6a2f4582f1ff5b582ed05ab6b59bb068e301"} Mar 17 11:28:52 crc kubenswrapper[4742]: I0317 11:28:52.953528 4742 generic.go:334] "Generic (PLEG): container finished" podID="e6b4bfa7-c424-4a08-8a06-f73809217eff" containerID="8b9c2243062095857a51a3f28af71f3b5accd26390dd2671dafde45d6a8adad4" exitCode=0 Mar 17 11:28:52 crc kubenswrapper[4742]: I0317 11:28:52.953718 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" event={"ID":"e6b4bfa7-c424-4a08-8a06-f73809217eff","Type":"ContainerDied","Data":"8b9c2243062095857a51a3f28af71f3b5accd26390dd2671dafde45d6a8adad4"} Mar 17 11:28:54 crc kubenswrapper[4742]: I0317 11:28:54.342109 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" Mar 17 11:28:54 crc kubenswrapper[4742]: I0317 11:28:54.476230 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhtln\" (UniqueName: \"kubernetes.io/projected/e6b4bfa7-c424-4a08-8a06-f73809217eff-kube-api-access-qhtln\") pod \"e6b4bfa7-c424-4a08-8a06-f73809217eff\" (UID: \"e6b4bfa7-c424-4a08-8a06-f73809217eff\") " Mar 17 11:28:54 crc kubenswrapper[4742]: I0317 11:28:54.476346 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6b4bfa7-c424-4a08-8a06-f73809217eff-util\") pod \"e6b4bfa7-c424-4a08-8a06-f73809217eff\" (UID: \"e6b4bfa7-c424-4a08-8a06-f73809217eff\") " Mar 17 11:28:54 crc kubenswrapper[4742]: I0317 11:28:54.476502 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6b4bfa7-c424-4a08-8a06-f73809217eff-bundle\") pod \"e6b4bfa7-c424-4a08-8a06-f73809217eff\" (UID: \"e6b4bfa7-c424-4a08-8a06-f73809217eff\") " Mar 17 11:28:54 crc kubenswrapper[4742]: I0317 11:28:54.477590 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b4bfa7-c424-4a08-8a06-f73809217eff-bundle" (OuterVolumeSpecName: "bundle") pod "e6b4bfa7-c424-4a08-8a06-f73809217eff" (UID: "e6b4bfa7-c424-4a08-8a06-f73809217eff"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:28:54 crc kubenswrapper[4742]: I0317 11:28:54.481603 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b4bfa7-c424-4a08-8a06-f73809217eff-kube-api-access-qhtln" (OuterVolumeSpecName: "kube-api-access-qhtln") pod "e6b4bfa7-c424-4a08-8a06-f73809217eff" (UID: "e6b4bfa7-c424-4a08-8a06-f73809217eff"). InnerVolumeSpecName "kube-api-access-qhtln". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:28:54 crc kubenswrapper[4742]: I0317 11:28:54.498598 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6b4bfa7-c424-4a08-8a06-f73809217eff-util" (OuterVolumeSpecName: "util") pod "e6b4bfa7-c424-4a08-8a06-f73809217eff" (UID: "e6b4bfa7-c424-4a08-8a06-f73809217eff"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:28:54 crc kubenswrapper[4742]: I0317 11:28:54.578995 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhtln\" (UniqueName: \"kubernetes.io/projected/e6b4bfa7-c424-4a08-8a06-f73809217eff-kube-api-access-qhtln\") on node \"crc\" DevicePath \"\"" Mar 17 11:28:54 crc kubenswrapper[4742]: I0317 11:28:54.579062 4742 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6b4bfa7-c424-4a08-8a06-f73809217eff-util\") on node \"crc\" DevicePath \"\"" Mar 17 11:28:54 crc kubenswrapper[4742]: I0317 11:28:54.579089 4742 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6b4bfa7-c424-4a08-8a06-f73809217eff-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:28:54 crc kubenswrapper[4742]: I0317 11:28:54.977732 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" Mar 17 11:28:54 crc kubenswrapper[4742]: I0317 11:28:54.977725 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q" event={"ID":"e6b4bfa7-c424-4a08-8a06-f73809217eff","Type":"ContainerDied","Data":"a31ff2b3a9a5669ad85dde6066f0ed5614ab232ca1c0c1940225ee5ec7f6ff46"} Mar 17 11:28:54 crc kubenswrapper[4742]: I0317 11:28:54.978477 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a31ff2b3a9a5669ad85dde6066f0ed5614ab232ca1c0c1940225ee5ec7f6ff46" Mar 17 11:28:59 crc kubenswrapper[4742]: I0317 11:28:59.344220 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-58b7c959b5-zkf6c"] Mar 17 11:28:59 crc kubenswrapper[4742]: E0317 11:28:59.344985 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b4bfa7-c424-4a08-8a06-f73809217eff" containerName="pull" Mar 17 11:28:59 crc kubenswrapper[4742]: I0317 11:28:59.344996 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b4bfa7-c424-4a08-8a06-f73809217eff" containerName="pull" Mar 17 11:28:59 crc kubenswrapper[4742]: E0317 11:28:59.345006 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b4bfa7-c424-4a08-8a06-f73809217eff" containerName="util" Mar 17 11:28:59 crc kubenswrapper[4742]: I0317 11:28:59.345012 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b4bfa7-c424-4a08-8a06-f73809217eff" containerName="util" Mar 17 11:28:59 crc kubenswrapper[4742]: E0317 11:28:59.345024 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b4bfa7-c424-4a08-8a06-f73809217eff" containerName="extract" Mar 17 11:28:59 crc kubenswrapper[4742]: I0317 11:28:59.345030 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b4bfa7-c424-4a08-8a06-f73809217eff" containerName="extract" Mar 17 11:28:59 crc kubenswrapper[4742]: I0317 11:28:59.345120 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b4bfa7-c424-4a08-8a06-f73809217eff" containerName="extract" Mar 17 11:28:59 crc kubenswrapper[4742]: I0317 11:28:59.345513 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-58b7c959b5-zkf6c" Mar 17 11:28:59 crc kubenswrapper[4742]: I0317 11:28:59.347895 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-n7zph" Mar 17 11:28:59 crc kubenswrapper[4742]: I0317 11:28:59.364720 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-58b7c959b5-zkf6c"] Mar 17 11:28:59 crc kubenswrapper[4742]: I0317 11:28:59.465026 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njxrn\" (UniqueName: \"kubernetes.io/projected/30159976-f1ef-435e-b6e6-995553b51f65-kube-api-access-njxrn\") pod \"openstack-operator-controller-init-58b7c959b5-zkf6c\" (UID: \"30159976-f1ef-435e-b6e6-995553b51f65\") " pod="openstack-operators/openstack-operator-controller-init-58b7c959b5-zkf6c" Mar 17 11:28:59 crc kubenswrapper[4742]: I0317 11:28:59.566200 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njxrn\" (UniqueName: \"kubernetes.io/projected/30159976-f1ef-435e-b6e6-995553b51f65-kube-api-access-njxrn\") pod \"openstack-operator-controller-init-58b7c959b5-zkf6c\" (UID: \"30159976-f1ef-435e-b6e6-995553b51f65\") " pod="openstack-operators/openstack-operator-controller-init-58b7c959b5-zkf6c" Mar 17 11:28:59 crc kubenswrapper[4742]: I0317 11:28:59.603613 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njxrn\" (UniqueName: \"kubernetes.io/projected/30159976-f1ef-435e-b6e6-995553b51f65-kube-api-access-njxrn\") pod \"openstack-operator-controller-init-58b7c959b5-zkf6c\" (UID: \"30159976-f1ef-435e-b6e6-995553b51f65\") " pod="openstack-operators/openstack-operator-controller-init-58b7c959b5-zkf6c" Mar 17 11:28:59 crc kubenswrapper[4742]: I0317 11:28:59.660363 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-58b7c959b5-zkf6c" Mar 17 11:29:00 crc kubenswrapper[4742]: I0317 11:29:00.143653 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-58b7c959b5-zkf6c"] Mar 17 11:29:01 crc kubenswrapper[4742]: I0317 11:29:01.030797 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-58b7c959b5-zkf6c" event={"ID":"30159976-f1ef-435e-b6e6-995553b51f65","Type":"ContainerStarted","Data":"eb229b5ccac7f4c2aeb3ef31152bfd58a548dc4dcbdd281b188d90d86444d38a"} Mar 17 11:29:05 crc kubenswrapper[4742]: I0317 11:29:05.060194 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-58b7c959b5-zkf6c" event={"ID":"30159976-f1ef-435e-b6e6-995553b51f65","Type":"ContainerStarted","Data":"db5454425e85309cee79e861e6444a419652d8d35526718a0d132c0bc405b534"} Mar 17 11:29:05 crc kubenswrapper[4742]: I0317 11:29:05.060745 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-58b7c959b5-zkf6c" Mar 17 11:29:05 crc kubenswrapper[4742]: I0317 11:29:05.089168 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-58b7c959b5-zkf6c" podStartSLOduration=2.331222436 podStartE2EDuration="6.089151433s" podCreationTimestamp="2026-03-17 11:28:59 +0000 UTC" firstStartedPulling="2026-03-17 11:29:00.156167606 +0000 UTC m=+1043.282295374" lastFinishedPulling="2026-03-17 11:29:03.914096603 +0000 UTC m=+1047.040224371" observedRunningTime="2026-03-17 11:29:05.085489882 +0000 UTC m=+1048.211617680" watchObservedRunningTime="2026-03-17 11:29:05.089151433 +0000 UTC m=+1048.215279201" Mar 17 11:29:09 crc kubenswrapper[4742]: I0317 11:29:09.662933 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-58b7c959b5-zkf6c" Mar 17 11:29:18 crc kubenswrapper[4742]: I0317 11:29:18.044132 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:29:18 crc kubenswrapper[4742]: I0317 11:29:18.044630 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:29:18 crc kubenswrapper[4742]: I0317 11:29:18.044678 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:29:18 crc kubenswrapper[4742]: I0317 11:29:18.045456 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e970ab8ae9b7236a8af0e70d950c97f70be620ea87e4acbc181c30424216e493"} pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 11:29:18 crc kubenswrapper[4742]: I0317 11:29:18.045530 4742 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" containerID="cri-o://e970ab8ae9b7236a8af0e70d950c97f70be620ea87e4acbc181c30424216e493" gracePeriod=600 Mar 17 11:29:19 crc kubenswrapper[4742]: I0317 11:29:19.159034 4742 generic.go:334] "Generic (PLEG): container finished" podID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerID="e970ab8ae9b7236a8af0e70d950c97f70be620ea87e4acbc181c30424216e493" exitCode=0 Mar 17 11:29:19 crc kubenswrapper[4742]: I0317 11:29:19.159109 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerDied","Data":"e970ab8ae9b7236a8af0e70d950c97f70be620ea87e4acbc181c30424216e493"} Mar 17 11:29:19 crc kubenswrapper[4742]: I0317 11:29:19.159409 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"a5ef1667f2e6dd9db693993b9f4f126e4ca6164458a0fe8e5b3f3f6b5159b8d2"} Mar 17 11:29:19 crc kubenswrapper[4742]: I0317 11:29:19.159439 4742 scope.go:117] "RemoveContainer" containerID="0b6d37342f3ee85fc8b1ee717e3e6b6ff2837c9e6e923cd75738d5afd1b0bd6d" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.647763 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-g729j"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.651080 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-g729j" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.655497 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-4phr7"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.655533 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-9cq68" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.656421 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4phr7" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.660858 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-g729j"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.663561 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-b7cfj" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.672720 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-4phr7"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.704544 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5htj\" (UniqueName: \"kubernetes.io/projected/45257cde-ca39-4e50-b465-b76ea15e179e-kube-api-access-m5htj\") pod \"barbican-operator-controller-manager-59bc569d95-g729j\" (UID: \"45257cde-ca39-4e50-b465-b76ea15e179e\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-g729j" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.704645 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbq5s\" (UniqueName: \"kubernetes.io/projected/27150936-220d-4247-b873-10add7124430-kube-api-access-dbq5s\") pod \"cinder-operator-controller-manager-8d58dc466-4phr7\" (UID: \"27150936-220d-4247-b873-10add7124430\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4phr7" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.714091 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-j5sfj"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.715018 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j5sfj" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.721665 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-cjmtl" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.730005 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-sq2xc"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.730834 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq2xc" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.737800 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-gcjsr" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.747190 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-j5sfj"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.756323 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-6z2xv"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.757281 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-6z2xv" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.760507 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-l82z9" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.773855 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-znwjl"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.774872 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-znwjl" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.783619 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-fwmbb" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.797451 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-sq2xc"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.806106 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-6z2xv"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.807687 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5htj\" (UniqueName: \"kubernetes.io/projected/45257cde-ca39-4e50-b465-b76ea15e179e-kube-api-access-m5htj\") pod \"barbican-operator-controller-manager-59bc569d95-g729j\" (UID: \"45257cde-ca39-4e50-b465-b76ea15e179e\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-g729j" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.807880 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhn6j\" (UniqueName: \"kubernetes.io/projected/01ae7820-ca74-4237-ac4a-82b3605f2306-kube-api-access-qhn6j\") pod \"designate-operator-controller-manager-588d4d986b-j5sfj\" (UID: \"01ae7820-ca74-4237-ac4a-82b3605f2306\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j5sfj" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.807992 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phv86\" (UniqueName: \"kubernetes.io/projected/9b21605a-2c83-49df-ae0f-dfb172a1b9f5-kube-api-access-phv86\") pod \"glance-operator-controller-manager-79df6bcc97-sq2xc\" (UID: \"9b21605a-2c83-49df-ae0f-dfb172a1b9f5\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq2xc" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.808069 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgr29\" (UniqueName: \"kubernetes.io/projected/a7d611a7-9728-4738-8efa-80883aa13b2b-kube-api-access-wgr29\") pod \"heat-operator-controller-manager-67dd5f86f5-6z2xv\" (UID: \"a7d611a7-9728-4738-8efa-80883aa13b2b\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-6z2xv" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.808141 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbq5s\" (UniqueName: \"kubernetes.io/projected/27150936-220d-4247-b873-10add7124430-kube-api-access-dbq5s\") pod \"cinder-operator-controller-manager-8d58dc466-4phr7\" (UID: 
\"27150936-220d-4247-b873-10add7124430\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4phr7" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.808265 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qktkm\" (UniqueName: \"kubernetes.io/projected/c8ccb584-e9e1-4eba-827e-3e7197f3133f-kube-api-access-qktkm\") pod \"horizon-operator-controller-manager-8464cc45fb-znwjl\" (UID: \"c8ccb584-e9e1-4eba-827e-3e7197f3133f\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-znwjl" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.820088 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.820990 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.824761 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-2mszf" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.825131 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.827997 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-znwjl"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.849600 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-4mj6d"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.850348 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4mj6d" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.857837 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbq5s\" (UniqueName: \"kubernetes.io/projected/27150936-220d-4247-b873-10add7124430-kube-api-access-dbq5s\") pod \"cinder-operator-controller-manager-8d58dc466-4phr7\" (UID: \"27150936-220d-4247-b873-10add7124430\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4phr7" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.858169 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-x8vhg" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.869340 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-4mj6d"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.884154 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.891593 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-dvbmd"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.892459 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dvbmd" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.905148 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5htj\" (UniqueName: \"kubernetes.io/projected/45257cde-ca39-4e50-b465-b76ea15e179e-kube-api-access-m5htj\") pod \"barbican-operator-controller-manager-59bc569d95-g729j\" (UID: \"45257cde-ca39-4e50-b465-b76ea15e179e\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-g729j" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.907127 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-rmwvq" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.911277 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qktkm\" (UniqueName: \"kubernetes.io/projected/c8ccb584-e9e1-4eba-827e-3e7197f3133f-kube-api-access-qktkm\") pod \"horizon-operator-controller-manager-8464cc45fb-znwjl\" (UID: \"c8ccb584-e9e1-4eba-827e-3e7197f3133f\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-znwjl" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.911356 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhn6j\" (UniqueName: \"kubernetes.io/projected/01ae7820-ca74-4237-ac4a-82b3605f2306-kube-api-access-qhn6j\") pod \"designate-operator-controller-manager-588d4d986b-j5sfj\" (UID: \"01ae7820-ca74-4237-ac4a-82b3605f2306\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j5sfj" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.911387 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phv86\" (UniqueName: \"kubernetes.io/projected/9b21605a-2c83-49df-ae0f-dfb172a1b9f5-kube-api-access-phv86\") pod \"glance-operator-controller-manager-79df6bcc97-sq2xc\" (UID: \"9b21605a-2c83-49df-ae0f-dfb172a1b9f5\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq2xc" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.911414 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgr29\" (UniqueName: \"kubernetes.io/projected/a7d611a7-9728-4738-8efa-80883aa13b2b-kube-api-access-wgr29\") pod \"heat-operator-controller-manager-67dd5f86f5-6z2xv\" (UID: \"a7d611a7-9728-4738-8efa-80883aa13b2b\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-6z2xv" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.911443 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4l4f\" (UniqueName: \"kubernetes.io/projected/1cdb0787-4a2a-41f6-aed0-8693b2669444-kube-api-access-b4l4f\") pod \"ironic-operator-controller-manager-6f787dddc9-4mj6d\" (UID: \"1cdb0787-4a2a-41f6-aed0-8693b2669444\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4mj6d" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.911466 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrx7h\" (UniqueName: \"kubernetes.io/projected/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-kube-api-access-rrx7h\") pod \"infra-operator-controller-manager-7b9c774f96-njktv\" (UID: \"c1f29dbe-e3d8-4dc0-aafe-fcd1de367544\") " 
pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.911503 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert\") pod \"infra-operator-controller-manager-7b9c774f96-njktv\" (UID: \"c1f29dbe-e3d8-4dc0-aafe-fcd1de367544\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.911526 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr6cc\" (UniqueName: \"kubernetes.io/projected/b3928371-ca20-41d9-8200-36410c2df752-kube-api-access-mr6cc\") pod \"keystone-operator-controller-manager-768b96df4c-dvbmd\" (UID: \"b3928371-ca20-41d9-8200-36410c2df752\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dvbmd" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.922333 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-xjp4g"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.932554 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-xjp4g" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.942329 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-7l24m" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.956572 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qktkm\" (UniqueName: \"kubernetes.io/projected/c8ccb584-e9e1-4eba-827e-3e7197f3133f-kube-api-access-qktkm\") pod \"horizon-operator-controller-manager-8464cc45fb-znwjl\" (UID: \"c8ccb584-e9e1-4eba-827e-3e7197f3133f\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-znwjl" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.961593 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-xjp4g"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.964504 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhn6j\" (UniqueName: \"kubernetes.io/projected/01ae7820-ca74-4237-ac4a-82b3605f2306-kube-api-access-qhn6j\") pod \"designate-operator-controller-manager-588d4d986b-j5sfj\" (UID: \"01ae7820-ca74-4237-ac4a-82b3605f2306\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j5sfj" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.966042 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-dvbmd"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.971007 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgr29\" (UniqueName: \"kubernetes.io/projected/a7d611a7-9728-4738-8efa-80883aa13b2b-kube-api-access-wgr29\") pod \"heat-operator-controller-manager-67dd5f86f5-6z2xv\" (UID: \"a7d611a7-9728-4738-8efa-80883aa13b2b\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-6z2xv" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.972620 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phv86\" (UniqueName: 
\"kubernetes.io/projected/9b21605a-2c83-49df-ae0f-dfb172a1b9f5-kube-api-access-phv86\") pod \"glance-operator-controller-manager-79df6bcc97-sq2xc\" (UID: \"9b21605a-2c83-49df-ae0f-dfb172a1b9f5\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq2xc" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.977054 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-g729j" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.987526 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4phr7" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.993974 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-7ttcf"] Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.995263 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-7ttcf" Mar 17 11:29:42 crc kubenswrapper[4742]: I0317 11:29:42.999688 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-9nk8q" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.015083 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4l4f\" (UniqueName: \"kubernetes.io/projected/1cdb0787-4a2a-41f6-aed0-8693b2669444-kube-api-access-b4l4f\") pod \"ironic-operator-controller-manager-6f787dddc9-4mj6d\" (UID: \"1cdb0787-4a2a-41f6-aed0-8693b2669444\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4mj6d" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.015118 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrx7h\" (UniqueName: \"kubernetes.io/projected/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-kube-api-access-rrx7h\") pod \"infra-operator-controller-manager-7b9c774f96-njktv\" (UID: \"c1f29dbe-e3d8-4dc0-aafe-fcd1de367544\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.015155 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert\") pod \"infra-operator-controller-manager-7b9c774f96-njktv\" (UID: \"c1f29dbe-e3d8-4dc0-aafe-fcd1de367544\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.015177 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr6cc\" (UniqueName: \"kubernetes.io/projected/b3928371-ca20-41d9-8200-36410c2df752-kube-api-access-mr6cc\") pod \"keystone-operator-controller-manager-768b96df4c-dvbmd\" (UID: \"b3928371-ca20-41d9-8200-36410c2df752\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dvbmd" Mar 17 11:29:43 crc kubenswrapper[4742]: E0317 11:29:43.015731 4742 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 17 11:29:43 crc kubenswrapper[4742]: E0317 11:29:43.015779 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert 
podName:c1f29dbe-e3d8-4dc0-aafe-fcd1de367544 nodeName:}" failed. No retries permitted until 2026-03-17 11:29:43.51576295 +0000 UTC m=+1086.641890708 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert") pod "infra-operator-controller-manager-7b9c774f96-njktv" (UID: "c1f29dbe-e3d8-4dc0-aafe-fcd1de367544") : secret "infra-operator-webhook-server-cert" not found Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.018069 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-7ttcf"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.039117 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-g252s"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.041640 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-g252s" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.046739 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr6cc\" (UniqueName: \"kubernetes.io/projected/b3928371-ca20-41d9-8200-36410c2df752-kube-api-access-mr6cc\") pod \"keystone-operator-controller-manager-768b96df4c-dvbmd\" (UID: \"b3928371-ca20-41d9-8200-36410c2df752\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dvbmd" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.046985 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j5sfj" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.051589 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-bh8nr" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.058019 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-vshmg"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.059118 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vshmg" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.059805 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq2xc" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.060531 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-pwbdt" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.062889 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4l4f\" (UniqueName: \"kubernetes.io/projected/1cdb0787-4a2a-41f6-aed0-8693b2669444-kube-api-access-b4l4f\") pod \"ironic-operator-controller-manager-6f787dddc9-4mj6d\" (UID: \"1cdb0787-4a2a-41f6-aed0-8693b2669444\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4mj6d" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.065364 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrx7h\" (UniqueName: \"kubernetes.io/projected/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-kube-api-access-rrx7h\") pod \"infra-operator-controller-manager-7b9c774f96-njktv\" (UID: \"c1f29dbe-e3d8-4dc0-aafe-fcd1de367544\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.067188 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-g252s"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.079958 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-4fvjv"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.080565 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-6z2xv" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.080750 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4fvjv" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.084486 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-4fvjv"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.086250 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-cmr8k" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.096049 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-znwjl" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.099867 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.100634 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.103934 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.104440 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-9c86w" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.119087 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-vshmg"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.123822 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsls5\" (UniqueName: \"kubernetes.io/projected/c59e15b4-2341-4b9e-8887-d6b1f594dc0e-kube-api-access-gsls5\") pod \"octavia-operator-controller-manager-5b9f45d989-4fvjv\" (UID: \"c59e15b4-2341-4b9e-8887-d6b1f594dc0e\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4fvjv" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.123892 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cpcx\" (UniqueName: \"kubernetes.io/projected/f91fb07a-de67-44ff-b6af-446891941a60-kube-api-access-9cpcx\") pod \"manila-operator-controller-manager-55f864c847-xjp4g\" (UID: \"f91fb07a-de67-44ff-b6af-446891941a60\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-xjp4g" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.123937 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-89w9s\" (UID: \"7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.123988 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsb9k\" (UniqueName: \"kubernetes.io/projected/0436441e-c132-4c65-aee5-8b20461c12e1-kube-api-access-lsb9k\") pod \"neutron-operator-controller-manager-767865f676-g252s\" (UID: \"0436441e-c132-4c65-aee5-8b20461c12e1\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-g252s" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.124025 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pjwh\" (UniqueName: \"kubernetes.io/projected/88b49b71-3d6b-4ca0-8943-c0d0c10b9ff9-kube-api-access-9pjwh\") pod \"nova-operator-controller-manager-5d488d59fb-vshmg\" (UID: \"88b49b71-3d6b-4ca0-8943-c0d0c10b9ff9\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vshmg" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.124068 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhnj2\" (UniqueName: \"kubernetes.io/projected/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-kube-api-access-jhnj2\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-89w9s\" (UID: \"7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.124088 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx6t2\" (UniqueName: \"kubernetes.io/projected/46b5befe-2274-4bc8-a2c4-ce8a9fc915ae-kube-api-access-hx6t2\") pod \"mariadb-operator-controller-manager-67ccfc9778-7ttcf\" (UID: \"46b5befe-2274-4bc8-a2c4-ce8a9fc915ae\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-7ttcf" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.149861 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.166805 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-c9j2m"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.167952 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-c9j2m" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.172565 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-xgwnm" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.173115 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-c9j2m"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.193709 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-zvlv9"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.194551 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zvlv9" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.207297 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-fh8v8"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.208346 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fh8v8" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.217623 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vdt6f" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.218323 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-gx84t" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.228054 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsls5\" (UniqueName: \"kubernetes.io/projected/c59e15b4-2341-4b9e-8887-d6b1f594dc0e-kube-api-access-gsls5\") pod \"octavia-operator-controller-manager-5b9f45d989-4fvjv\" (UID: \"c59e15b4-2341-4b9e-8887-d6b1f594dc0e\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4fvjv" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.228278 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cpcx\" (UniqueName: \"kubernetes.io/projected/f91fb07a-de67-44ff-b6af-446891941a60-kube-api-access-9cpcx\") pod \"manila-operator-controller-manager-55f864c847-xjp4g\" (UID: \"f91fb07a-de67-44ff-b6af-446891941a60\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-xjp4g" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.228330 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-89w9s\" (UID: \"7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.228392 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsb9k\" (UniqueName: \"kubernetes.io/projected/0436441e-c132-4c65-aee5-8b20461c12e1-kube-api-access-lsb9k\") pod \"neutron-operator-controller-manager-767865f676-g252s\" (UID: \"0436441e-c132-4c65-aee5-8b20461c12e1\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-g252s" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.228430 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pjwh\" (UniqueName: \"kubernetes.io/projected/88b49b71-3d6b-4ca0-8943-c0d0c10b9ff9-kube-api-access-9pjwh\") pod \"nova-operator-controller-manager-5d488d59fb-vshmg\" (UID: \"88b49b71-3d6b-4ca0-8943-c0d0c10b9ff9\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vshmg" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.228563 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhnj2\" (UniqueName: \"kubernetes.io/projected/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-kube-api-access-jhnj2\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-89w9s\" (UID: \"7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.228593 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx6t2\" (UniqueName: \"kubernetes.io/projected/46b5befe-2274-4bc8-a2c4-ce8a9fc915ae-kube-api-access-hx6t2\") pod 
\"mariadb-operator-controller-manager-67ccfc9778-7ttcf\" (UID: \"46b5befe-2274-4bc8-a2c4-ce8a9fc915ae\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-7ttcf" Mar 17 11:29:43 crc kubenswrapper[4742]: E0317 11:29:43.230699 4742 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 11:29:43 crc kubenswrapper[4742]: E0317 11:29:43.230749 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert podName:7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0 nodeName:}" failed. No retries permitted until 2026-03-17 11:29:43.730733482 +0000 UTC m=+1086.856861240 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-89w9s" (UID: "7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.232259 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-zvlv9"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.237260 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-fh8v8"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.257599 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsb9k\" (UniqueName: \"kubernetes.io/projected/0436441e-c132-4c65-aee5-8b20461c12e1-kube-api-access-lsb9k\") pod \"neutron-operator-controller-manager-767865f676-g252s\" (UID: \"0436441e-c132-4c65-aee5-8b20461c12e1\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-g252s" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.261154 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsls5\" (UniqueName: \"kubernetes.io/projected/c59e15b4-2341-4b9e-8887-d6b1f594dc0e-kube-api-access-gsls5\") pod \"octavia-operator-controller-manager-5b9f45d989-4fvjv\" (UID: \"c59e15b4-2341-4b9e-8887-d6b1f594dc0e\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4fvjv" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.262095 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pjwh\" (UniqueName: \"kubernetes.io/projected/88b49b71-3d6b-4ca0-8943-c0d0c10b9ff9-kube-api-access-9pjwh\") pod \"nova-operator-controller-manager-5d488d59fb-vshmg\" (UID: \"88b49b71-3d6b-4ca0-8943-c0d0c10b9ff9\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vshmg" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.267666 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-rzpkl"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.267737 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4mj6d" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.268816 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-rzpkl" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.271573 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cpcx\" (UniqueName: \"kubernetes.io/projected/f91fb07a-de67-44ff-b6af-446891941a60-kube-api-access-9cpcx\") pod \"manila-operator-controller-manager-55f864c847-xjp4g\" (UID: \"f91fb07a-de67-44ff-b6af-446891941a60\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-xjp4g" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.274344 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-wzr7z" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.274453 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx6t2\" (UniqueName: \"kubernetes.io/projected/46b5befe-2274-4bc8-a2c4-ce8a9fc915ae-kube-api-access-hx6t2\") pod \"mariadb-operator-controller-manager-67ccfc9778-7ttcf\" (UID: \"46b5befe-2274-4bc8-a2c4-ce8a9fc915ae\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-7ttcf" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.283571 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-rzpkl"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.284065 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhnj2\" (UniqueName: \"kubernetes.io/projected/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-kube-api-access-jhnj2\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-89w9s\" (UID: \"7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.315325 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dvbmd" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.325192 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-fhqr4"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.325870 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-fhqr4" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.333956 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-2fsgl" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.334573 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-fhqr4"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.395479 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-xjp4g" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.396792 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkm2n\" (UniqueName: \"kubernetes.io/projected/53837a21-9249-4ff8-aa95-bdfbb6d49f33-kube-api-access-vkm2n\") pod \"swift-operator-controller-manager-c674c5965-fh8v8\" (UID: \"53837a21-9249-4ff8-aa95-bdfbb6d49f33\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-fh8v8" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.396823 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg2r7\" (UniqueName: \"kubernetes.io/projected/b6a6e1ca-6c30-4a35-bd0c-b700160fe8ee-kube-api-access-fg2r7\") pod \"test-operator-controller-manager-5c5cb9c4d7-fhqr4\" (UID: \"b6a6e1ca-6c30-4a35-bd0c-b700160fe8ee\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-fhqr4" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.396854 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x7kr\" (UniqueName: \"kubernetes.io/projected/f42b3e9f-55a9-47fe-a5b8-51b36d622657-kube-api-access-2x7kr\") pod \"telemetry-operator-controller-manager-d6b694c5-rzpkl\" (UID: \"f42b3e9f-55a9-47fe-a5b8-51b36d622657\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-rzpkl" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.396893 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fpnk\" (UniqueName: \"kubernetes.io/projected/5e3c7784-527e-4f97-b035-240b7014241f-kube-api-access-2fpnk\") pod \"placement-operator-controller-manager-5784578c99-c9j2m\" (UID: \"5e3c7784-527e-4f97-b035-240b7014241f\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-c9j2m" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.396931 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glvml\" (UniqueName: \"kubernetes.io/projected/9470c17e-90c4-4723-b3ef-af8ec6f1edc2-kube-api-access-glvml\") pod \"ovn-operator-controller-manager-884679f54-zvlv9\" (UID: \"9470c17e-90c4-4723-b3ef-af8ec6f1edc2\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-zvlv9" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.397186 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-7ttcf" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.411807 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-g252s" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.448212 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vshmg" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.452832 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rf6p4"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.457493 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4fvjv" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.462850 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rf6p4" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.534832 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkm2n\" (UniqueName: \"kubernetes.io/projected/53837a21-9249-4ff8-aa95-bdfbb6d49f33-kube-api-access-vkm2n\") pod \"swift-operator-controller-manager-c674c5965-fh8v8\" (UID: \"53837a21-9249-4ff8-aa95-bdfbb6d49f33\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-fh8v8" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.534890 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg2r7\" (UniqueName: \"kubernetes.io/projected/b6a6e1ca-6c30-4a35-bd0c-b700160fe8ee-kube-api-access-fg2r7\") pod \"test-operator-controller-manager-5c5cb9c4d7-fhqr4\" (UID: \"b6a6e1ca-6c30-4a35-bd0c-b700160fe8ee\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-fhqr4" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.534995 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x7kr\" (UniqueName: \"kubernetes.io/projected/f42b3e9f-55a9-47fe-a5b8-51b36d622657-kube-api-access-2x7kr\") pod \"telemetry-operator-controller-manager-d6b694c5-rzpkl\" (UID: \"f42b3e9f-55a9-47fe-a5b8-51b36d622657\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-rzpkl" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.535042 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fpnk\" (UniqueName: \"kubernetes.io/projected/5e3c7784-527e-4f97-b035-240b7014241f-kube-api-access-2fpnk\") pod \"placement-operator-controller-manager-5784578c99-c9j2m\" (UID: \"5e3c7784-527e-4f97-b035-240b7014241f\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-c9j2m" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.535076 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glvml\" (UniqueName: \"kubernetes.io/projected/9470c17e-90c4-4723-b3ef-af8ec6f1edc2-kube-api-access-glvml\") pod \"ovn-operator-controller-manager-884679f54-zvlv9\" (UID: \"9470c17e-90c4-4723-b3ef-af8ec6f1edc2\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-zvlv9" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.535141 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert\") pod \"infra-operator-controller-manager-7b9c774f96-njktv\" (UID: \"c1f29dbe-e3d8-4dc0-aafe-fcd1de367544\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" Mar 17 11:29:43 crc kubenswrapper[4742]: E0317 11:29:43.535326 4742 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 17 11:29:43 crc kubenswrapper[4742]: E0317 11:29:43.535389 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert podName:c1f29dbe-e3d8-4dc0-aafe-fcd1de367544 nodeName:}" failed. 
No retries permitted until 2026-03-17 11:29:44.535368274 +0000 UTC m=+1087.661496032 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert") pod "infra-operator-controller-manager-7b9c774f96-njktv" (UID: "c1f29dbe-e3d8-4dc0-aafe-fcd1de367544") : secret "infra-operator-webhook-server-cert" not found Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.536915 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-c8c4x" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.554057 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rf6p4"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.580553 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glvml\" (UniqueName: \"kubernetes.io/projected/9470c17e-90c4-4723-b3ef-af8ec6f1edc2-kube-api-access-glvml\") pod \"ovn-operator-controller-manager-884679f54-zvlv9\" (UID: \"9470c17e-90c4-4723-b3ef-af8ec6f1edc2\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-zvlv9" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.583444 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkm2n\" (UniqueName: \"kubernetes.io/projected/53837a21-9249-4ff8-aa95-bdfbb6d49f33-kube-api-access-vkm2n\") pod \"swift-operator-controller-manager-c674c5965-fh8v8\" (UID: \"53837a21-9249-4ff8-aa95-bdfbb6d49f33\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-fh8v8" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.583859 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x7kr\" (UniqueName: \"kubernetes.io/projected/f42b3e9f-55a9-47fe-a5b8-51b36d622657-kube-api-access-2x7kr\") pod \"telemetry-operator-controller-manager-d6b694c5-rzpkl\" (UID: \"f42b3e9f-55a9-47fe-a5b8-51b36d622657\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-rzpkl" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.584461 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fpnk\" (UniqueName: \"kubernetes.io/projected/5e3c7784-527e-4f97-b035-240b7014241f-kube-api-access-2fpnk\") pod \"placement-operator-controller-manager-5784578c99-c9j2m\" (UID: \"5e3c7784-527e-4f97-b035-240b7014241f\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-c9j2m" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.596116 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg2r7\" (UniqueName: \"kubernetes.io/projected/b6a6e1ca-6c30-4a35-bd0c-b700160fe8ee-kube-api-access-fg2r7\") pod \"test-operator-controller-manager-5c5cb9c4d7-fhqr4\" (UID: \"b6a6e1ca-6c30-4a35-bd0c-b700160fe8ee\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-fhqr4" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.605150 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-rzpkl" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.630071 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.631018 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.636940 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.637115 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-tmhx7" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.637217 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.639365 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.640040 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8ftm\" (UniqueName: \"kubernetes.io/projected/0eaedaeb-8d0d-4fde-8b74-cdd689d56123-kube-api-access-b8ftm\") pod \"watcher-operator-controller-manager-6c4d75f7f9-rf6p4\" (UID: \"0eaedaeb-8d0d-4fde-8b74-cdd689d56123\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rf6p4" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.679495 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-44jhz"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.680403 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-44jhz" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.694376 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-q82jh" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.717327 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-44jhz"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.741292 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-89w9s\" (UID: \"7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.741357 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6fbv\" (UniqueName: \"kubernetes.io/projected/7d6829e2-3788-4653-91e4-bff007a7bb5d-kube-api-access-c6fbv\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.741439 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8ftm\" (UniqueName: \"kubernetes.io/projected/0eaedaeb-8d0d-4fde-8b74-cdd689d56123-kube-api-access-b8ftm\") pod \"watcher-operator-controller-manager-6c4d75f7f9-rf6p4\" (UID: \"0eaedaeb-8d0d-4fde-8b74-cdd689d56123\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rf6p4" Mar 17 11:29:43 crc kubenswrapper[4742]: E0317 11:29:43.741449 4742 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.741469 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.741491 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:43 crc kubenswrapper[4742]: E0317 11:29:43.741526 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert podName:7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0 nodeName:}" failed. No retries permitted until 2026-03-17 11:29:44.74150731 +0000 UTC m=+1087.867635068 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-89w9s" (UID: "7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.760806 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8ftm\" (UniqueName: \"kubernetes.io/projected/0eaedaeb-8d0d-4fde-8b74-cdd689d56123-kube-api-access-b8ftm\") pod \"watcher-operator-controller-manager-6c4d75f7f9-rf6p4\" (UID: \"0eaedaeb-8d0d-4fde-8b74-cdd689d56123\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rf6p4" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.788719 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-c9j2m" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.806360 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-j5sfj"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.810408 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zvlv9" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.818042 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-4phr7"] Mar 17 11:29:43 crc kubenswrapper[4742]: W0317 11:29:43.838684 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27150936_220d_4247_b873_10add7124430.slice/crio-831585a36871deda1463bb230a54686c5c94e0b964f8ccd397bd8d7a8f1f9600 WatchSource:0}: Error finding container 831585a36871deda1463bb230a54686c5c94e0b964f8ccd397bd8d7a8f1f9600: Status 404 returned error can't find the container with id 831585a36871deda1463bb230a54686c5c94e0b964f8ccd397bd8d7a8f1f9600 Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.841082 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fh8v8" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.843043 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.843074 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qprj\" (UniqueName: \"kubernetes.io/projected/48d26de5-4809-4a61-82c3-03cbf56c57b0-kube-api-access-4qprj\") pod \"rabbitmq-cluster-operator-manager-668c99d594-44jhz\" (UID: \"48d26de5-4809-4a61-82c3-03cbf56c57b0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-44jhz" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.843095 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.843160 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6fbv\" (UniqueName: \"kubernetes.io/projected/7d6829e2-3788-4653-91e4-bff007a7bb5d-kube-api-access-c6fbv\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:43 crc kubenswrapper[4742]: E0317 11:29:43.844131 4742 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 17 11:29:43 crc kubenswrapper[4742]: E0317 11:29:43.844167 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs podName:7d6829e2-3788-4653-91e4-bff007a7bb5d nodeName:}" failed. No retries permitted until 2026-03-17 11:29:44.344153542 +0000 UTC m=+1087.470281300 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs") pod "openstack-operator-controller-manager-c748c4754-6hffs" (UID: "7d6829e2-3788-4653-91e4-bff007a7bb5d") : secret "metrics-server-cert" not found Mar 17 11:29:43 crc kubenswrapper[4742]: E0317 11:29:43.844227 4742 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 11:29:43 crc kubenswrapper[4742]: E0317 11:29:43.844248 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs podName:7d6829e2-3788-4653-91e4-bff007a7bb5d nodeName:}" failed. No retries permitted until 2026-03-17 11:29:44.344242044 +0000 UTC m=+1087.470369802 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs") pod "openstack-operator-controller-manager-c748c4754-6hffs" (UID: "7d6829e2-3788-4653-91e4-bff007a7bb5d") : secret "webhook-server-cert" not found Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.846561 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-fhqr4" Mar 17 11:29:43 crc kubenswrapper[4742]: W0317 11:29:43.846645 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45257cde_ca39_4e50_b465_b76ea15e179e.slice/crio-60aac0db81fc27afb7bece9fc6c37c2358118d732be0a2d1a80f58ea0c705af7 WatchSource:0}: Error finding container 60aac0db81fc27afb7bece9fc6c37c2358118d732be0a2d1a80f58ea0c705af7: Status 404 returned error can't find the container with id 60aac0db81fc27afb7bece9fc6c37c2358118d732be0a2d1a80f58ea0c705af7 Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.848359 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-g729j"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.866918 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6fbv\" (UniqueName: \"kubernetes.io/projected/7d6829e2-3788-4653-91e4-bff007a7bb5d-kube-api-access-c6fbv\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.870133 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rf6p4" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.907450 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-6z2xv"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.916708 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-sq2xc"] Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.945698 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qprj\" (UniqueName: \"kubernetes.io/projected/48d26de5-4809-4a61-82c3-03cbf56c57b0-kube-api-access-4qprj\") pod \"rabbitmq-cluster-operator-manager-668c99d594-44jhz\" (UID: \"48d26de5-4809-4a61-82c3-03cbf56c57b0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-44jhz" Mar 17 11:29:43 crc kubenswrapper[4742]: I0317 11:29:43.964638 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qprj\" (UniqueName: \"kubernetes.io/projected/48d26de5-4809-4a61-82c3-03cbf56c57b0-kube-api-access-4qprj\") pod \"rabbitmq-cluster-operator-manager-668c99d594-44jhz\" (UID: \"48d26de5-4809-4a61-82c3-03cbf56c57b0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-44jhz" Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.002950 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-znwjl"] Mar 17 11:29:44 crc kubenswrapper[4742]: W0317 11:29:44.034835 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8ccb584_e9e1_4eba_827e_3e7197f3133f.slice/crio-0147a0c5830270b09e9299c236d96afe5f7dfa447ab2de39fd3ea1f768abbbb9 WatchSource:0}: Error finding container 0147a0c5830270b09e9299c236d96afe5f7dfa447ab2de39fd3ea1f768abbbb9: Status 404 returned error can't find the container with id 0147a0c5830270b09e9299c236d96afe5f7dfa447ab2de39fd3ea1f768abbbb9 Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.036145 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-44jhz" Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.138477 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-4mj6d"] Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.150668 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-dvbmd"] Mar 17 11:29:44 crc kubenswrapper[4742]: W0317 11:29:44.157549 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3928371_ca20_41d9_8200_36410c2df752.slice/crio-72c10fe7e390308b40ac8f95958304bf56b99fdea5fffdbd376c9bd803b20b4b WatchSource:0}: Error finding container 72c10fe7e390308b40ac8f95958304bf56b99fdea5fffdbd376c9bd803b20b4b: Status 404 returned error can't find the container with id 72c10fe7e390308b40ac8f95958304bf56b99fdea5fffdbd376c9bd803b20b4b Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.201476 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-xjp4g"] Mar 17 11:29:44 crc kubenswrapper[4742]: W0317 11:29:44.235408 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf91fb07a_de67_44ff_b6af_446891941a60.slice/crio-d828afac8f52be8209b4e60d1ea920b46cc4e8e1185b6b9ede57e6a40fe2eb1b WatchSource:0}: Error finding container d828afac8f52be8209b4e60d1ea920b46cc4e8e1185b6b9ede57e6a40fe2eb1b: Status 404 returned error can't find the container with id d828afac8f52be8209b4e60d1ea920b46cc4e8e1185b6b9ede57e6a40fe2eb1b Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.248059 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-7ttcf"] Mar 17 11:29:44 crc kubenswrapper[4742]: W0317 11:29:44.255962 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc59e15b4_2341_4b9e_8887_d6b1f594dc0e.slice/crio-0a8963690385908e4ac2b838e6f14778cf058fe151d8e3b1ece49a069388cff2 WatchSource:0}: Error finding container 0a8963690385908e4ac2b838e6f14778cf058fe151d8e3b1ece49a069388cff2: Status 404 returned error can't find the container with id 0a8963690385908e4ac2b838e6f14778cf058fe151d8e3b1ece49a069388cff2 Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.257853 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-4fvjv"] Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.274246 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-vshmg"] Mar 17 11:29:44 crc kubenswrapper[4742]: W0317 11:29:44.278488 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88b49b71_3d6b_4ca0_8943_c0d0c10b9ff9.slice/crio-92ca8c57587ec2bf8f258c7092113c68891719eec571d6b63681e53dff7d4380 WatchSource:0}: Error finding container 92ca8c57587ec2bf8f258c7092113c68891719eec571d6b63681e53dff7d4380: Status 404 returned error can't find the container with id 92ca8c57587ec2bf8f258c7092113c68891719eec571d6b63681e53dff7d4380 Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.349406 4742 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j5sfj" event={"ID":"01ae7820-ca74-4237-ac4a-82b3605f2306","Type":"ContainerStarted","Data":"1af6b936f2971db115965359ee8b99c0cf3d64e4b8ae57c617db24575d988072"} Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.350133 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.350169 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.350495 4742 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.350542 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs podName:7d6829e2-3788-4653-91e4-bff007a7bb5d nodeName:}" failed. No retries permitted until 2026-03-17 11:29:45.350530217 +0000 UTC m=+1088.476657975 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs") pod "openstack-operator-controller-manager-c748c4754-6hffs" (UID: "7d6829e2-3788-4653-91e4-bff007a7bb5d") : secret "webhook-server-cert" not found Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.350720 4742 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.350796 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs podName:7d6829e2-3788-4653-91e4-bff007a7bb5d nodeName:}" failed. No retries permitted until 2026-03-17 11:29:45.350775234 +0000 UTC m=+1088.476902992 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs") pod "openstack-operator-controller-manager-c748c4754-6hffs" (UID: "7d6829e2-3788-4653-91e4-bff007a7bb5d") : secret "metrics-server-cert" not found Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.357016 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-rzpkl"] Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.357603 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-xjp4g" event={"ID":"f91fb07a-de67-44ff-b6af-446891941a60","Type":"ContainerStarted","Data":"d828afac8f52be8209b4e60d1ea920b46cc4e8e1185b6b9ede57e6a40fe2eb1b"} Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.358797 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4mj6d" event={"ID":"1cdb0787-4a2a-41f6-aed0-8693b2669444","Type":"ContainerStarted","Data":"519d508aaeeb68fd66e28e616ee06a3b64467d01f0c4d673e9fe0b782a187f6f"} Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.359622 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vshmg" event={"ID":"88b49b71-3d6b-4ca0-8943-c0d0c10b9ff9","Type":"ContainerStarted","Data":"92ca8c57587ec2bf8f258c7092113c68891719eec571d6b63681e53dff7d4380"} Mar 17 11:29:44 crc kubenswrapper[4742]: W0317 11:29:44.360351 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0436441e_c132_4c65_aee5_8b20461c12e1.slice/crio-5468b0a2d2880d3e389a2688a7c5ac5bb88383d202c65f11008c09a8aa69d89f WatchSource:0}: Error finding container 5468b0a2d2880d3e389a2688a7c5ac5bb88383d202c65f11008c09a8aa69d89f: Status 404 returned error can't find the container with id 5468b0a2d2880d3e389a2688a7c5ac5bb88383d202c65f11008c09a8aa69d89f Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.360704 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-7ttcf" event={"ID":"46b5befe-2274-4bc8-a2c4-ce8a9fc915ae","Type":"ContainerStarted","Data":"af9d020b351f9618cce71f21b778a796183ba8636f6b91a8a0d3576c2a3c1171"} Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.362095 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4fvjv" event={"ID":"c59e15b4-2341-4b9e-8887-d6b1f594dc0e","Type":"ContainerStarted","Data":"0a8963690385908e4ac2b838e6f14778cf058fe151d8e3b1ece49a069388cff2"} Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.362266 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-g252s"] Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.362879 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lsb9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-767865f676-g252s_openstack-operators(0436441e-c132-4c65-aee5-8b20461c12e1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.366022 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-g252s" podUID="0436441e-c132-4c65-aee5-8b20461c12e1" Mar 17 11:29:44 crc kubenswrapper[4742]: W0317 11:29:44.366125 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf42b3e9f_55a9_47fe_a5b8_51b36d622657.slice/crio-b8cd06ca978a2037e15bcb702476775df3915f2106a734c8e05fb8bc04fc2be8 WatchSource:0}: Error finding container b8cd06ca978a2037e15bcb702476775df3915f2106a734c8e05fb8bc04fc2be8: Status 404 returned error can't find the container with id b8cd06ca978a2037e15bcb702476775df3915f2106a734c8e05fb8bc04fc2be8 Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.366650 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dvbmd" event={"ID":"b3928371-ca20-41d9-8200-36410c2df752","Type":"ContainerStarted","Data":"72c10fe7e390308b40ac8f95958304bf56b99fdea5fffdbd376c9bd803b20b4b"} Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.368321 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4phr7" event={"ID":"27150936-220d-4247-b873-10add7124430","Type":"ContainerStarted","Data":"831585a36871deda1463bb230a54686c5c94e0b964f8ccd397bd8d7a8f1f9600"} Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.368414 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2x7kr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-rzpkl_openstack-operators(f42b3e9f-55a9-47fe-a5b8-51b36d622657): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.369532 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-rzpkl" podUID="f42b3e9f-55a9-47fe-a5b8-51b36d622657" Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.371026 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-6z2xv" event={"ID":"a7d611a7-9728-4738-8efa-80883aa13b2b","Type":"ContainerStarted","Data":"4ed635bb53ff50641ca84d9519ecec9bd57346b9246284312c5dc1e3c2fd56be"} Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.374300 4742 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq2xc" event={"ID":"9b21605a-2c83-49df-ae0f-dfb172a1b9f5","Type":"ContainerStarted","Data":"99803cbfe6bba67a9a34a195e57b121dec85fc9f9bc211979d92f66c86ac4ce6"} Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.376080 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-znwjl" event={"ID":"c8ccb584-e9e1-4eba-827e-3e7197f3133f","Type":"ContainerStarted","Data":"0147a0c5830270b09e9299c236d96afe5f7dfa447ab2de39fd3ea1f768abbbb9"} Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.377384 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-g729j" event={"ID":"45257cde-ca39-4e50-b465-b76ea15e179e","Type":"ContainerStarted","Data":"60aac0db81fc27afb7bece9fc6c37c2358118d732be0a2d1a80f58ea0c705af7"} Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.454754 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-zvlv9"] Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.461585 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-c9j2m"] Mar 17 11:29:44 crc kubenswrapper[4742]: W0317 11:29:44.474076 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e3c7784_527e_4f97_b035_240b7014241f.slice/crio-6ae4bd731a254826a16ed7aafbb98d14385790cee68c746a32d07330085eda49 WatchSource:0}: Error finding container 6ae4bd731a254826a16ed7aafbb98d14385790cee68c746a32d07330085eda49: Status 404 returned error can't find the container with id 6ae4bd731a254826a16ed7aafbb98d14385790cee68c746a32d07330085eda49 Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.474883 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-44jhz"] Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.476438 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2fpnk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-c9j2m_openstack-operators(5e3c7784-527e-4f97-b035-240b7014241f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.477586 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-c9j2m" podUID="5e3c7784-527e-4f97-b035-240b7014241f" Mar 17 11:29:44 crc kubenswrapper[4742]: W0317 11:29:44.477881 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48d26de5_4809_4a61_82c3_03cbf56c57b0.slice/crio-af787c46e0626aa0096dffeaff0b6d2420ba05110e101d4695fda46bc67c8427 WatchSource:0}: Error finding container af787c46e0626aa0096dffeaff0b6d2420ba05110e101d4695fda46bc67c8427: Status 404 returned error can't find the container with id af787c46e0626aa0096dffeaff0b6d2420ba05110e101d4695fda46bc67c8427 Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.482408 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4qprj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-44jhz_openstack-operators(48d26de5-4809-4a61-82c3-03cbf56c57b0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.483602 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-44jhz" podUID="48d26de5-4809-4a61-82c3-03cbf56c57b0" Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.553501 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert\") pod \"infra-operator-controller-manager-7b9c774f96-njktv\" (UID: \"c1f29dbe-e3d8-4dc0-aafe-fcd1de367544\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.553669 4742 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.553756 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert podName:c1f29dbe-e3d8-4dc0-aafe-fcd1de367544 nodeName:}" failed. No retries permitted until 2026-03-17 11:29:46.553734861 +0000 UTC m=+1089.679862619 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert") pod "infra-operator-controller-manager-7b9c774f96-njktv" (UID: "c1f29dbe-e3d8-4dc0-aafe-fcd1de367544") : secret "infra-operator-webhook-server-cert" not found Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.584084 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-fhqr4"] Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.586192 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fg2r7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-fhqr4_openstack-operators(b6a6e1ca-6c30-4a35-bd0c-b700160fe8ee): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.587987 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-fhqr4" podUID="b6a6e1ca-6c30-4a35-bd0c-b700160fe8ee" Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.588096 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-fh8v8"] 
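The burst of ErrImagePull failures above all carry the same message, "pull QPS exceeded": kubelet rate-limits registry pulls with a token bucket (the registryPullQPS / registryBurst kubelet settings), and scheduling every openstack-k8s-operators controller-manager at once exhausts it, so the excess pulls fail immediately and the pods fall into ImagePullBackOff on the next sync. A minimal sketch of that token-bucket behavior, assuming the commonly cited defaults of 5 QPS with a burst of 10 (illustrative only, using golang.org/x/time/rate rather than kubelet's internal limiter):

```go
// Illustrative sketch, not kubelet source: image pulls past the bucket's
// burst are rejected outright ("pull QPS exceeded") instead of being queued.
package main

import (
	"fmt"

	"golang.org/x/time/rate"
)

func main() {
	// Assumed defaults: 5 pulls/sec, burst of 10 (check your kubelet config).
	limiter := rate.NewLimiter(rate.Limit(5), 10)

	// Simulate ~20 operator images all requested in the same instant,
	// as when the openstack-operators namespace starts up.
	for i := 1; i <= 20; i++ {
		if limiter.Allow() {
			fmt.Printf("pull %2d: started\n", i)
		} else {
			// kubelet surfaces this as ErrImagePull on the container,
			// followed by ImagePullBackOff on subsequent pod syncs.
			fmt.Printf("pull %2d: ErrImagePull: pull QPS exceeded\n", i)
		}
	}
}
```

Once the limiter refills, retries succeed, which is consistent with the ContainerStarted events that follow a few seconds later in this log.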
Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.590476 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vkm2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-fh8v8_openstack-operators(53837a21-9249-4ff8-aa95-bdfbb6d49f33): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.597629 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rf6p4"] Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.597653 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fh8v8" podUID="53837a21-9249-4ff8-aa95-bdfbb6d49f33" Mar 17 11:29:44 crc kubenswrapper[4742]: W0317 11:29:44.609165 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eaedaeb_8d0d_4fde_8b74_cdd689d56123.slice/crio-c3b39cc40e814a0dfc34343296ef9eb9cc9fe807e1e0041675d9da48576099cb WatchSource:0}: Error finding container c3b39cc40e814a0dfc34343296ef9eb9cc9fe807e1e0041675d9da48576099cb: Status 404 returned error can't find the container with id 
c3b39cc40e814a0dfc34343296ef9eb9cc9fe807e1e0041675d9da48576099cb Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.611727 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b8ftm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-rf6p4_openstack-operators(0eaedaeb-8d0d-4fde-8b74-cdd689d56123): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.613116 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rf6p4" podUID="0eaedaeb-8d0d-4fde-8b74-cdd689d56123" Mar 17 11:29:44 crc kubenswrapper[4742]: I0317 11:29:44.762816 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-89w9s\" (UID: \"7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.763266 4742 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 11:29:44 crc kubenswrapper[4742]: E0317 11:29:44.763376 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert podName:7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0 nodeName:}" failed. No retries permitted until 2026-03-17 11:29:46.763315523 +0000 UTC m=+1089.889443301 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-89w9s" (UID: "7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 11:29:45 crc kubenswrapper[4742]: I0317 11:29:45.385874 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fh8v8" event={"ID":"53837a21-9249-4ff8-aa95-bdfbb6d49f33","Type":"ContainerStarted","Data":"e26b76cafe7a19be4b497b8b64a5ec3d8eeb83c5f50fb42ab80d32f86290c39d"} Mar 17 11:29:45 crc kubenswrapper[4742]: I0317 11:29:45.387464 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zvlv9" event={"ID":"9470c17e-90c4-4723-b3ef-af8ec6f1edc2","Type":"ContainerStarted","Data":"acf478818afb50c9c5d579dd51d6ff6d1b8a5e332f7bd06d6b0ae4a9128f0e72"} Mar 17 11:29:45 crc kubenswrapper[4742]: E0317 11:29:45.388790 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fh8v8" podUID="53837a21-9249-4ff8-aa95-bdfbb6d49f33" Mar 17 11:29:45 crc kubenswrapper[4742]: I0317 11:29:45.389225 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-c9j2m" event={"ID":"5e3c7784-527e-4f97-b035-240b7014241f","Type":"ContainerStarted","Data":"6ae4bd731a254826a16ed7aafbb98d14385790cee68c746a32d07330085eda49"} Mar 17 11:29:45 crc kubenswrapper[4742]: E0317 11:29:45.390184 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-c9j2m" podUID="5e3c7784-527e-4f97-b035-240b7014241f" Mar 17 11:29:45 crc kubenswrapper[4742]: I0317 11:29:45.391026 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rf6p4" event={"ID":"0eaedaeb-8d0d-4fde-8b74-cdd689d56123","Type":"ContainerStarted","Data":"c3b39cc40e814a0dfc34343296ef9eb9cc9fe807e1e0041675d9da48576099cb"} Mar 17 11:29:45 crc kubenswrapper[4742]: E0317 11:29:45.391743 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rf6p4" podUID="0eaedaeb-8d0d-4fde-8b74-cdd689d56123" Mar 17 11:29:45 crc kubenswrapper[4742]: I0317 11:29:45.392673 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:45 crc kubenswrapper[4742]: I0317 11:29:45.392710 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:45 crc kubenswrapper[4742]: I0317 11:29:45.393973 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-rzpkl" event={"ID":"f42b3e9f-55a9-47fe-a5b8-51b36d622657","Type":"ContainerStarted","Data":"b8cd06ca978a2037e15bcb702476775df3915f2106a734c8e05fb8bc04fc2be8"} Mar 17 11:29:45 crc kubenswrapper[4742]: E0317 11:29:45.394032 4742 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 17 11:29:45 crc kubenswrapper[4742]: E0317 11:29:45.394093 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs podName:7d6829e2-3788-4653-91e4-bff007a7bb5d nodeName:}" failed. No retries permitted until 2026-03-17 11:29:47.394074995 +0000 UTC m=+1090.520202753 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs") pod "openstack-operator-controller-manager-c748c4754-6hffs" (UID: "7d6829e2-3788-4653-91e4-bff007a7bb5d") : secret "metrics-server-cert" not found Mar 17 11:29:45 crc kubenswrapper[4742]: E0317 11:29:45.394149 4742 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 11:29:45 crc kubenswrapper[4742]: E0317 11:29:45.394234 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs podName:7d6829e2-3788-4653-91e4-bff007a7bb5d nodeName:}" failed. No retries permitted until 2026-03-17 11:29:47.394209398 +0000 UTC m=+1090.520337156 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs") pod "openstack-operator-controller-manager-c748c4754-6hffs" (UID: "7d6829e2-3788-4653-91e4-bff007a7bb5d") : secret "webhook-server-cert" not found Mar 17 11:29:45 crc kubenswrapper[4742]: E0317 11:29:45.395143 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-rzpkl" podUID="f42b3e9f-55a9-47fe-a5b8-51b36d622657" Mar 17 11:29:45 crc kubenswrapper[4742]: I0317 11:29:45.395767 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-fhqr4" event={"ID":"b6a6e1ca-6c30-4a35-bd0c-b700160fe8ee","Type":"ContainerStarted","Data":"29ba4cec216b94f89f8c056c7f45322372b39426c435ad8aa9fd63b37fd52301"} Mar 17 11:29:45 crc kubenswrapper[4742]: E0317 11:29:45.397850 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-fhqr4" podUID="b6a6e1ca-6c30-4a35-bd0c-b700160fe8ee" Mar 17 11:29:45 crc kubenswrapper[4742]: I0317 11:29:45.400188 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-44jhz" event={"ID":"48d26de5-4809-4a61-82c3-03cbf56c57b0","Type":"ContainerStarted","Data":"af787c46e0626aa0096dffeaff0b6d2420ba05110e101d4695fda46bc67c8427"} Mar 17 11:29:45 crc kubenswrapper[4742]: E0317 11:29:45.404561 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-44jhz" podUID="48d26de5-4809-4a61-82c3-03cbf56c57b0" Mar 17 11:29:45 crc kubenswrapper[4742]: I0317 11:29:45.404571 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-g252s" event={"ID":"0436441e-c132-4c65-aee5-8b20461c12e1","Type":"ContainerStarted","Data":"5468b0a2d2880d3e389a2688a7c5ac5bb88383d202c65f11008c09a8aa69d89f"} Mar 17 11:29:45 crc kubenswrapper[4742]: E0317 11:29:45.405426 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-g252s" podUID="0436441e-c132-4c65-aee5-8b20461c12e1" Mar 17 11:29:46 crc kubenswrapper[4742]: E0317 11:29:46.423369 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-44jhz" podUID="48d26de5-4809-4a61-82c3-03cbf56c57b0" Mar 17 11:29:46 crc kubenswrapper[4742]: E0317 11:29:46.423368 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-rzpkl" podUID="f42b3e9f-55a9-47fe-a5b8-51b36d622657" Mar 17 11:29:46 crc kubenswrapper[4742]: E0317 11:29:46.423384 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-fhqr4" podUID="b6a6e1ca-6c30-4a35-bd0c-b700160fe8ee" Mar 17 11:29:46 crc kubenswrapper[4742]: E0317 11:29:46.423430 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fh8v8" podUID="53837a21-9249-4ff8-aa95-bdfbb6d49f33" Mar 17 11:29:46 crc kubenswrapper[4742]: E0317 11:29:46.423697 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rf6p4" podUID="0eaedaeb-8d0d-4fde-8b74-cdd689d56123" Mar 17 11:29:46 crc kubenswrapper[4742]: E0317 11:29:46.423713 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-g252s" podUID="0436441e-c132-4c65-aee5-8b20461c12e1" Mar 17 11:29:46 crc kubenswrapper[4742]: E0317 11:29:46.423820 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-c9j2m" podUID="5e3c7784-527e-4f97-b035-240b7014241f" Mar 17 11:29:46 crc kubenswrapper[4742]: I0317 11:29:46.642894 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert\") pod \"infra-operator-controller-manager-7b9c774f96-njktv\" (UID: \"c1f29dbe-e3d8-4dc0-aafe-fcd1de367544\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" Mar 17 11:29:46 crc 
kubenswrapper[4742]: E0317 11:29:46.643035 4742 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 17 11:29:46 crc kubenswrapper[4742]: E0317 11:29:46.643106 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert podName:c1f29dbe-e3d8-4dc0-aafe-fcd1de367544 nodeName:}" failed. No retries permitted until 2026-03-17 11:29:50.64308745 +0000 UTC m=+1093.769215208 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert") pod "infra-operator-controller-manager-7b9c774f96-njktv" (UID: "c1f29dbe-e3d8-4dc0-aafe-fcd1de367544") : secret "infra-operator-webhook-server-cert" not found Mar 17 11:29:46 crc kubenswrapper[4742]: I0317 11:29:46.858718 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-89w9s\" (UID: \"7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" Mar 17 11:29:46 crc kubenswrapper[4742]: E0317 11:29:46.858877 4742 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 11:29:46 crc kubenswrapper[4742]: E0317 11:29:46.858970 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert podName:7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0 nodeName:}" failed. No retries permitted until 2026-03-17 11:29:50.858948286 +0000 UTC m=+1093.985076054 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-89w9s" (UID: "7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 11:29:47 crc kubenswrapper[4742]: I0317 11:29:47.470724 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:47 crc kubenswrapper[4742]: I0317 11:29:47.471204 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:47 crc kubenswrapper[4742]: E0317 11:29:47.470987 4742 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 17 11:29:47 crc kubenswrapper[4742]: E0317 11:29:47.471316 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs podName:7d6829e2-3788-4653-91e4-bff007a7bb5d nodeName:}" failed. 
No retries permitted until 2026-03-17 11:29:51.471292315 +0000 UTC m=+1094.597420083 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs") pod "openstack-operator-controller-manager-c748c4754-6hffs" (UID: "7d6829e2-3788-4653-91e4-bff007a7bb5d") : secret "metrics-server-cert" not found Mar 17 11:29:47 crc kubenswrapper[4742]: E0317 11:29:47.471440 4742 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 11:29:47 crc kubenswrapper[4742]: E0317 11:29:47.471475 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs podName:7d6829e2-3788-4653-91e4-bff007a7bb5d nodeName:}" failed. No retries permitted until 2026-03-17 11:29:51.471465119 +0000 UTC m=+1094.597592887 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs") pod "openstack-operator-controller-manager-c748c4754-6hffs" (UID: "7d6829e2-3788-4653-91e4-bff007a7bb5d") : secret "webhook-server-cert" not found Mar 17 11:29:50 crc kubenswrapper[4742]: I0317 11:29:50.720837 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert\") pod \"infra-operator-controller-manager-7b9c774f96-njktv\" (UID: \"c1f29dbe-e3d8-4dc0-aafe-fcd1de367544\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" Mar 17 11:29:50 crc kubenswrapper[4742]: E0317 11:29:50.721179 4742 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 17 11:29:50 crc kubenswrapper[4742]: E0317 11:29:50.721268 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert podName:c1f29dbe-e3d8-4dc0-aafe-fcd1de367544 nodeName:}" failed. No retries permitted until 2026-03-17 11:29:58.721244901 +0000 UTC m=+1101.847372679 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert") pod "infra-operator-controller-manager-7b9c774f96-njktv" (UID: "c1f29dbe-e3d8-4dc0-aafe-fcd1de367544") : secret "infra-operator-webhook-server-cert" not found Mar 17 11:29:50 crc kubenswrapper[4742]: I0317 11:29:50.923363 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-89w9s\" (UID: \"7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" Mar 17 11:29:50 crc kubenswrapper[4742]: E0317 11:29:50.923665 4742 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 11:29:50 crc kubenswrapper[4742]: E0317 11:29:50.923735 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert podName:7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0 nodeName:}" failed. 
No retries permitted until 2026-03-17 11:29:58.923715206 +0000 UTC m=+1102.049842964 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-89w9s" (UID: "7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 17 11:29:51 crc kubenswrapper[4742]: I0317 11:29:51.531620 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:51 crc kubenswrapper[4742]: I0317 11:29:51.531674 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:51 crc kubenswrapper[4742]: E0317 11:29:51.531823 4742 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 11:29:51 crc kubenswrapper[4742]: E0317 11:29:51.531824 4742 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 17 11:29:51 crc kubenswrapper[4742]: E0317 11:29:51.531877 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs podName:7d6829e2-3788-4653-91e4-bff007a7bb5d nodeName:}" failed. No retries permitted until 2026-03-17 11:29:59.531862378 +0000 UTC m=+1102.657990136 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs") pod "openstack-operator-controller-manager-c748c4754-6hffs" (UID: "7d6829e2-3788-4653-91e4-bff007a7bb5d") : secret "metrics-server-cert" not found Mar 17 11:29:51 crc kubenswrapper[4742]: E0317 11:29:51.531899 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs podName:7d6829e2-3788-4653-91e4-bff007a7bb5d nodeName:}" failed. No retries permitted until 2026-03-17 11:29:59.531894299 +0000 UTC m=+1102.658022057 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs") pod "openstack-operator-controller-manager-c748c4754-6hffs" (UID: "7d6829e2-3788-4653-91e4-bff007a7bb5d") : secret "webhook-server-cert" not found Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.475379 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zvlv9" event={"ID":"9470c17e-90c4-4723-b3ef-af8ec6f1edc2","Type":"ContainerStarted","Data":"addb9f57e15400def1dc860f8072e4a32a81af99ebebc4e89710941dc4f7b596"} Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.476040 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zvlv9" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.476704 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j5sfj" event={"ID":"01ae7820-ca74-4237-ac4a-82b3605f2306","Type":"ContainerStarted","Data":"830be92dfeb4dcffd812bda5cddac0b272451268d80cc8e11984ecd29602172f"} Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.476808 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j5sfj" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.478228 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq2xc" event={"ID":"9b21605a-2c83-49df-ae0f-dfb172a1b9f5","Type":"ContainerStarted","Data":"a14d6e5cb92d01ffa08ea22345c6c3330bf35e03a7bc530acd4f65d989b19487"} Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.478365 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq2xc" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.479790 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-xjp4g" event={"ID":"f91fb07a-de67-44ff-b6af-446891941a60","Type":"ContainerStarted","Data":"a398ef621a26b1e03bd4c7d7d79d241bdf7a0d72e89ebb8e51e4f7df0cff764e"} Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.479948 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-xjp4g" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.481275 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-7ttcf" event={"ID":"46b5befe-2274-4bc8-a2c4-ce8a9fc915ae","Type":"ContainerStarted","Data":"7895fd878c1037c1efaf923d9fc58af67851bba2ede127d72c3fb5d2e59944e3"} Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.481387 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-7ttcf" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.482765 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4mj6d" event={"ID":"1cdb0787-4a2a-41f6-aed0-8693b2669444","Type":"ContainerStarted","Data":"2a19d7a4fef630d738e89399fe44c2f662cc6f9df285b0a67c7feae9aa467cb4"} Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.482869 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4mj6d" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.484785 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4fvjv" event={"ID":"c59e15b4-2341-4b9e-8887-d6b1f594dc0e","Type":"ContainerStarted","Data":"19f3aa3dc8ee48175e974f471b9a341a20b12283ab7cbaf28ce37e0fa1668e53"} Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.485262 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4fvjv" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.486399 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dvbmd" event={"ID":"b3928371-ca20-41d9-8200-36410c2df752","Type":"ContainerStarted","Data":"2cb13208442c9f31342fcabeba0f575e7fa41c11919e113e201c8c1acdba8041"} Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.486832 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dvbmd" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.488196 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-znwjl" event={"ID":"c8ccb584-e9e1-4eba-827e-3e7197f3133f","Type":"ContainerStarted","Data":"0358c135fe58f9d7bc3c3ad163b713e42f363228155bcfe9253d002f2f529e55"} Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.488673 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-znwjl" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.490028 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-g729j" event={"ID":"45257cde-ca39-4e50-b465-b76ea15e179e","Type":"ContainerStarted","Data":"ee0c6b0303d6a5554e6ac9d1fcdefab815eff8d363fcc5a91de54f0cd271c46c"} Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.490608 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-g729j" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.492348 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4phr7" event={"ID":"27150936-220d-4247-b873-10add7124430","Type":"ContainerStarted","Data":"a84fbb26e12aa9a7c781d166b96976f8fe859e5d4c89d38d61d654ef4bb11fef"} Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.492894 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4phr7" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.494036 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vshmg" event={"ID":"88b49b71-3d6b-4ca0-8943-c0d0c10b9ff9","Type":"ContainerStarted","Data":"be1c6fb6c238526fecaa76f251d6f8fd2b1d81508ba67a01d0a9e1003fb2a141"} Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.494419 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vshmg" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.495732 4742 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-6z2xv" event={"ID":"a7d611a7-9728-4738-8efa-80883aa13b2b","Type":"ContainerStarted","Data":"253abdcb19df489e61aeed2c2d395ed642864d3f74e73deb6a06021beb5953a7"} Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.496179 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-6z2xv" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.505643 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zvlv9" podStartSLOduration=3.091016417 podStartE2EDuration="13.505626551s" podCreationTimestamp="2026-03-17 11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:44.464401631 +0000 UTC m=+1087.590529379" lastFinishedPulling="2026-03-17 11:29:54.879011755 +0000 UTC m=+1098.005139513" observedRunningTime="2026-03-17 11:29:55.500317154 +0000 UTC m=+1098.626444922" watchObservedRunningTime="2026-03-17 11:29:55.505626551 +0000 UTC m=+1098.631754309" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.595380 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-7ttcf" podStartSLOduration=3.014585983 podStartE2EDuration="13.595358973s" podCreationTimestamp="2026-03-17 11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:44.265788104 +0000 UTC m=+1087.391915862" lastFinishedPulling="2026-03-17 11:29:54.846561094 +0000 UTC m=+1097.972688852" observedRunningTime="2026-03-17 11:29:55.589749067 +0000 UTC m=+1098.715876825" watchObservedRunningTime="2026-03-17 11:29:55.595358973 +0000 UTC m=+1098.721486731" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.595866 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4mj6d" podStartSLOduration=2.931778093 podStartE2EDuration="13.595860368s" podCreationTimestamp="2026-03-17 11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:44.176998847 +0000 UTC m=+1087.303126605" lastFinishedPulling="2026-03-17 11:29:54.841081122 +0000 UTC m=+1097.967208880" observedRunningTime="2026-03-17 11:29:55.546744023 +0000 UTC m=+1098.672871801" watchObservedRunningTime="2026-03-17 11:29:55.595860368 +0000 UTC m=+1098.721988126" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.652117 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4phr7" podStartSLOduration=2.690283795 podStartE2EDuration="13.65210108s" podCreationTimestamp="2026-03-17 11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:43.882801164 +0000 UTC m=+1087.008928922" lastFinishedPulling="2026-03-17 11:29:54.844618449 +0000 UTC m=+1097.970746207" observedRunningTime="2026-03-17 11:29:55.639280084 +0000 UTC m=+1098.765407842" watchObservedRunningTime="2026-03-17 11:29:55.65210108 +0000 UTC m=+1098.778228828" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.767739 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-xjp4g" podStartSLOduration=3.187949019 podStartE2EDuration="13.767722292s" podCreationTimestamp="2026-03-17 11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:44.263111359 +0000 UTC m=+1087.389239117" lastFinishedPulling="2026-03-17 11:29:54.842884592 +0000 UTC 
m=+1097.969012390" observedRunningTime="2026-03-17 11:29:55.714795611 +0000 UTC m=+1098.840923369" watchObservedRunningTime="2026-03-17 11:29:55.767722292 +0000 UTC m=+1098.893850050" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.767863 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j5sfj" podStartSLOduration=2.7660873009999998 podStartE2EDuration="13.767860426s" podCreationTimestamp="2026-03-17 11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:43.838793242 +0000 UTC m=+1086.964921000" lastFinishedPulling="2026-03-17 11:29:54.840566367 +0000 UTC m=+1097.966694125" observedRunningTime="2026-03-17 11:29:55.753219978 +0000 UTC m=+1098.879347726" watchObservedRunningTime="2026-03-17 11:29:55.767860426 +0000 UTC m=+1098.893988184" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.793821 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-6z2xv" podStartSLOduration=2.964223314 podStartE2EDuration="13.793802716s" podCreationTimestamp="2026-03-17 11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:44.011671044 +0000 UTC m=+1087.137798802" lastFinishedPulling="2026-03-17 11:29:54.841250446 +0000 UTC m=+1097.967378204" observedRunningTime="2026-03-17 11:29:55.789781694 +0000 UTC m=+1098.915909452" watchObservedRunningTime="2026-03-17 11:29:55.793802716 +0000 UTC m=+1098.919930474" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.825654 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dvbmd" podStartSLOduration=2.9961192690000003 podStartE2EDuration="13.82563564s" podCreationTimestamp="2026-03-17 11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:44.171526844 +0000 UTC m=+1087.297654602" lastFinishedPulling="2026-03-17 11:29:55.001043185 +0000 UTC m=+1098.127170973" observedRunningTime="2026-03-17 11:29:55.821709971 +0000 UTC m=+1098.947837729" watchObservedRunningTime="2026-03-17 11:29:55.82563564 +0000 UTC m=+1098.951763398" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.865126 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq2xc" podStartSLOduration=3.034771843 podStartE2EDuration="13.865105616s" podCreationTimestamp="2026-03-17 11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:44.014617686 +0000 UTC m=+1087.140745444" lastFinishedPulling="2026-03-17 11:29:54.844951459 +0000 UTC m=+1097.971079217" observedRunningTime="2026-03-17 11:29:55.851463917 +0000 UTC m=+1098.977591685" watchObservedRunningTime="2026-03-17 11:29:55.865105616 +0000 UTC m=+1098.991233384" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.890054 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-g729j" podStartSLOduration=2.930728573 podStartE2EDuration="13.890025919s" podCreationTimestamp="2026-03-17 11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:43.881747195 +0000 UTC m=+1087.007874953" lastFinishedPulling="2026-03-17 11:29:54.841044511 +0000 UTC m=+1097.967172299" observedRunningTime="2026-03-17 11:29:55.886839771 +0000 UTC m=+1099.012967539" watchObservedRunningTime="2026-03-17 11:29:55.890025919 +0000 UTC m=+1099.016153677" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 
11:29:55.992839 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4fvjv" podStartSLOduration=3.417745542 podStartE2EDuration="13.992822214s" podCreationTimestamp="2026-03-17 11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:44.265838525 +0000 UTC m=+1087.391966283" lastFinishedPulling="2026-03-17 11:29:54.840915197 +0000 UTC m=+1097.967042955" observedRunningTime="2026-03-17 11:29:55.955984371 +0000 UTC m=+1099.082112139" watchObservedRunningTime="2026-03-17 11:29:55.992822214 +0000 UTC m=+1099.118949962" Mar 17 11:29:55 crc kubenswrapper[4742]: I0317 11:29:55.996151 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-znwjl" podStartSLOduration=3.196380013 podStartE2EDuration="13.996146877s" podCreationTimestamp="2026-03-17 11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:44.077685998 +0000 UTC m=+1087.203813756" lastFinishedPulling="2026-03-17 11:29:54.877452862 +0000 UTC m=+1098.003580620" observedRunningTime="2026-03-17 11:29:55.98835444 +0000 UTC m=+1099.114482198" watchObservedRunningTime="2026-03-17 11:29:55.996146877 +0000 UTC m=+1099.122274635" Mar 17 11:29:56 crc kubenswrapper[4742]: I0317 11:29:56.037408 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vshmg" podStartSLOduration=3.473044138 podStartE2EDuration="14.037391302s" podCreationTimestamp="2026-03-17 11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:44.280914014 +0000 UTC m=+1087.407041772" lastFinishedPulling="2026-03-17 11:29:54.845261188 +0000 UTC m=+1097.971388936" observedRunningTime="2026-03-17 11:29:56.031276793 +0000 UTC m=+1099.157404551" watchObservedRunningTime="2026-03-17 11:29:56.037391302 +0000 UTC m=+1099.163519060" Mar 17 11:29:58 crc kubenswrapper[4742]: I0317 11:29:58.754203 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert\") pod \"infra-operator-controller-manager-7b9c774f96-njktv\" (UID: \"c1f29dbe-e3d8-4dc0-aafe-fcd1de367544\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" Mar 17 11:29:58 crc kubenswrapper[4742]: I0317 11:29:58.775603 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1f29dbe-e3d8-4dc0-aafe-fcd1de367544-cert\") pod \"infra-operator-controller-manager-7b9c774f96-njktv\" (UID: \"c1f29dbe-e3d8-4dc0-aafe-fcd1de367544\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" Mar 17 11:29:58 crc kubenswrapper[4742]: I0317 11:29:58.959145 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-89w9s\" (UID: \"7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" Mar 17 11:29:58 crc kubenswrapper[4742]: I0317 11:29:58.963897 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-89w9s\" (UID: 
\"7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" Mar 17 11:29:59 crc kubenswrapper[4742]: I0317 11:29:59.044421 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" Mar 17 11:29:59 crc kubenswrapper[4742]: I0317 11:29:59.078959 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" Mar 17 11:29:59 crc kubenswrapper[4742]: I0317 11:29:59.356432 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv"] Mar 17 11:29:59 crc kubenswrapper[4742]: W0317 11:29:59.360880 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1f29dbe_e3d8_4dc0_aafe_fcd1de367544.slice/crio-e0df3e0e148df6e28364e4f70ce3f830a412c7a7634c45215acd1333c4095972 WatchSource:0}: Error finding container e0df3e0e148df6e28364e4f70ce3f830a412c7a7634c45215acd1333c4095972: Status 404 returned error can't find the container with id e0df3e0e148df6e28364e4f70ce3f830a412c7a7634c45215acd1333c4095972 Mar 17 11:29:59 crc kubenswrapper[4742]: I0317 11:29:59.525296 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" event={"ID":"c1f29dbe-e3d8-4dc0-aafe-fcd1de367544","Type":"ContainerStarted","Data":"e0df3e0e148df6e28364e4f70ce3f830a412c7a7634c45215acd1333c4095972"} Mar 17 11:29:59 crc kubenswrapper[4742]: I0317 11:29:59.567781 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:59 crc kubenswrapper[4742]: I0317 11:29:59.567867 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:59 crc kubenswrapper[4742]: E0317 11:29:59.568115 4742 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 17 11:29:59 crc kubenswrapper[4742]: E0317 11:29:59.568206 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs podName:7d6829e2-3788-4653-91e4-bff007a7bb5d nodeName:}" failed. No retries permitted until 2026-03-17 11:30:15.568182562 +0000 UTC m=+1118.694310350 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs") pod "openstack-operator-controller-manager-c748c4754-6hffs" (UID: "7d6829e2-3788-4653-91e4-bff007a7bb5d") : secret "webhook-server-cert" not found Mar 17 11:29:59 crc kubenswrapper[4742]: I0317 11:29:59.575744 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-metrics-certs\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:29:59 crc kubenswrapper[4742]: I0317 11:29:59.622637 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s"] Mar 17 11:29:59 crc kubenswrapper[4742]: W0317 11:29:59.632265 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a86c487_f0e3_40bf_a1fe_e70e97a0d8c0.slice/crio-6d3e6f8cefb0d049d80f93f6c53c0098506032dca69a303c8b8093cf9cdb8598 WatchSource:0}: Error finding container 6d3e6f8cefb0d049d80f93f6c53c0098506032dca69a303c8b8093cf9cdb8598: Status 404 returned error can't find the container with id 6d3e6f8cefb0d049d80f93f6c53c0098506032dca69a303c8b8093cf9cdb8598 Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.131036 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562450-jzztm"] Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.132113 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562450-jzztm" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.134284 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.135130 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.135307 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.148479 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562450-jzztm"] Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.181619 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvf84\" (UniqueName: \"kubernetes.io/projected/677f677c-39b4-4713-afc0-57fb6b36a1a3-kube-api-access-bvf84\") pod \"auto-csr-approver-29562450-jzztm\" (UID: \"677f677c-39b4-4713-afc0-57fb6b36a1a3\") " pod="openshift-infra/auto-csr-approver-29562450-jzztm" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.234022 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v"] Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.238599 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.242863 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.243193 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.244851 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v"] Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.282597 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcjwf\" (UniqueName: \"kubernetes.io/projected/7057d36e-e38b-41f9-98f1-7f136f859aec-kube-api-access-bcjwf\") pod \"collect-profiles-29562450-5zd5v\" (UID: \"7057d36e-e38b-41f9-98f1-7f136f859aec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.282704 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7057d36e-e38b-41f9-98f1-7f136f859aec-config-volume\") pod \"collect-profiles-29562450-5zd5v\" (UID: \"7057d36e-e38b-41f9-98f1-7f136f859aec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.282737 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7057d36e-e38b-41f9-98f1-7f136f859aec-secret-volume\") pod \"collect-profiles-29562450-5zd5v\" (UID: \"7057d36e-e38b-41f9-98f1-7f136f859aec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.282760 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvf84\" (UniqueName: \"kubernetes.io/projected/677f677c-39b4-4713-afc0-57fb6b36a1a3-kube-api-access-bvf84\") pod \"auto-csr-approver-29562450-jzztm\" (UID: \"677f677c-39b4-4713-afc0-57fb6b36a1a3\") " pod="openshift-infra/auto-csr-approver-29562450-jzztm" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.298884 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvf84\" (UniqueName: \"kubernetes.io/projected/677f677c-39b4-4713-afc0-57fb6b36a1a3-kube-api-access-bvf84\") pod \"auto-csr-approver-29562450-jzztm\" (UID: \"677f677c-39b4-4713-afc0-57fb6b36a1a3\") " pod="openshift-infra/auto-csr-approver-29562450-jzztm" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.384075 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7057d36e-e38b-41f9-98f1-7f136f859aec-config-volume\") pod \"collect-profiles-29562450-5zd5v\" (UID: \"7057d36e-e38b-41f9-98f1-7f136f859aec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.384394 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7057d36e-e38b-41f9-98f1-7f136f859aec-secret-volume\") pod 
\"collect-profiles-29562450-5zd5v\" (UID: \"7057d36e-e38b-41f9-98f1-7f136f859aec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.384428 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcjwf\" (UniqueName: \"kubernetes.io/projected/7057d36e-e38b-41f9-98f1-7f136f859aec-kube-api-access-bcjwf\") pod \"collect-profiles-29562450-5zd5v\" (UID: \"7057d36e-e38b-41f9-98f1-7f136f859aec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.385576 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7057d36e-e38b-41f9-98f1-7f136f859aec-config-volume\") pod \"collect-profiles-29562450-5zd5v\" (UID: \"7057d36e-e38b-41f9-98f1-7f136f859aec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.390674 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7057d36e-e38b-41f9-98f1-7f136f859aec-secret-volume\") pod \"collect-profiles-29562450-5zd5v\" (UID: \"7057d36e-e38b-41f9-98f1-7f136f859aec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.400711 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcjwf\" (UniqueName: \"kubernetes.io/projected/7057d36e-e38b-41f9-98f1-7f136f859aec-kube-api-access-bcjwf\") pod \"collect-profiles-29562450-5zd5v\" (UID: \"7057d36e-e38b-41f9-98f1-7f136f859aec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.455657 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562450-jzztm" Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.531964 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" event={"ID":"7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0","Type":"ContainerStarted","Data":"6d3e6f8cefb0d049d80f93f6c53c0098506032dca69a303c8b8093cf9cdb8598"} Mar 17 11:30:00 crc kubenswrapper[4742]: I0317 11:30:00.556942 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" Mar 17 11:30:02 crc kubenswrapper[4742]: I0317 11:30:02.118515 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562450-jzztm"] Mar 17 11:30:02 crc kubenswrapper[4742]: I0317 11:30:02.177204 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v"] Mar 17 11:30:02 crc kubenswrapper[4742]: W0317 11:30:02.383625 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7057d36e_e38b_41f9_98f1_7f136f859aec.slice/crio-4497c2f2c083d7249332f3d18b93aef00b201713775000d214dec04109b406c4 WatchSource:0}: Error finding container 4497c2f2c083d7249332f3d18b93aef00b201713775000d214dec04109b406c4: Status 404 returned error can't find the container with id 4497c2f2c083d7249332f3d18b93aef00b201713775000d214dec04109b406c4 Mar 17 11:30:02 crc kubenswrapper[4742]: I0317 11:30:02.555153 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562450-jzztm" event={"ID":"677f677c-39b4-4713-afc0-57fb6b36a1a3","Type":"ContainerStarted","Data":"846430701c1b841d89eb0745403cd6e20ed1d8726bb3f9ee12abf6533e3ec22e"} Mar 17 11:30:02 crc kubenswrapper[4742]: I0317 11:30:02.556078 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" event={"ID":"7057d36e-e38b-41f9-98f1-7f136f859aec","Type":"ContainerStarted","Data":"4497c2f2c083d7249332f3d18b93aef00b201713775000d214dec04109b406c4"} Mar 17 11:30:02 crc kubenswrapper[4742]: I0317 11:30:02.980588 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-g729j" Mar 17 11:30:03 crc kubenswrapper[4742]: I0317 11:30:03.003829 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4phr7" Mar 17 11:30:03 crc kubenswrapper[4742]: I0317 11:30:03.049624 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j5sfj" Mar 17 11:30:03 crc kubenswrapper[4742]: I0317 11:30:03.062831 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-sq2xc" Mar 17 11:30:03 crc kubenswrapper[4742]: I0317 11:30:03.086710 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-6z2xv" Mar 17 11:30:03 crc kubenswrapper[4742]: I0317 11:30:03.101309 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-znwjl" Mar 17 11:30:03 crc kubenswrapper[4742]: I0317 11:30:03.270915 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-4mj6d" Mar 17 11:30:03 crc kubenswrapper[4742]: I0317 11:30:03.318460 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dvbmd" Mar 17 11:30:03 crc kubenswrapper[4742]: I0317 11:30:03.398980 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/manila-operator-controller-manager-55f864c847-xjp4g" Mar 17 11:30:03 crc kubenswrapper[4742]: I0317 11:30:03.399871 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-7ttcf" Mar 17 11:30:03 crc kubenswrapper[4742]: I0317 11:30:03.456227 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-vshmg" Mar 17 11:30:03 crc kubenswrapper[4742]: I0317 11:30:03.460343 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4fvjv" Mar 17 11:30:03 crc kubenswrapper[4742]: I0317 11:30:03.814866 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-zvlv9" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.666021 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fh8v8" event={"ID":"53837a21-9249-4ff8-aa95-bdfbb6d49f33","Type":"ContainerStarted","Data":"0b041f6bb2f26e39c517d88c47fe93f6aedceef1311bb79d69dbf9b6e5202391"} Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.666704 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fh8v8" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.670684 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" event={"ID":"7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0","Type":"ContainerStarted","Data":"facde8d1ffdcee249eab8cf8f1c89972b15b17171d7ec7ca8b6b0601e1682df6"} Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.670751 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.675096 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-c9j2m" event={"ID":"5e3c7784-527e-4f97-b035-240b7014241f","Type":"ContainerStarted","Data":"01f3f25901ce528374e86eb802b0d56274f2f2a7de2835bb84f555f3dc391f36"} Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.675404 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-c9j2m" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.676874 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rf6p4" event={"ID":"0eaedaeb-8d0d-4fde-8b74-cdd689d56123","Type":"ContainerStarted","Data":"a723c8f040a0d205578b04f9338644f36bc683fce80eb55107f7eaa4aa3f9417"} Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.677612 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rf6p4" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.688864 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-fhqr4" event={"ID":"b6a6e1ca-6c30-4a35-bd0c-b700160fe8ee","Type":"ContainerStarted","Data":"2694456c540c9ca31a160588c2415a29dcdc77509eeb6d5cf1dc164546ca442f"} Mar 17 
11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.690049 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-fhqr4" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.693566 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-44jhz" event={"ID":"48d26de5-4809-4a61-82c3-03cbf56c57b0","Type":"ContainerStarted","Data":"c2f3a8edf0570000b05390cd0da59576778510257c01ac866e8281cfa12b2b9d"} Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.696454 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-g252s" event={"ID":"0436441e-c132-4c65-aee5-8b20461c12e1","Type":"ContainerStarted","Data":"005638edeaa7a72f5c9dcbdc83658a3d3973992f09ad76ddc5c69ef798d8f9bc"} Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.697131 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-g252s" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.699508 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" event={"ID":"7057d36e-e38b-41f9-98f1-7f136f859aec","Type":"ContainerStarted","Data":"93817ed3cdb466999dc3b44859ea51e8c6851d26ddf4dca3d5259a85d5219e31"} Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.713592 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" event={"ID":"c1f29dbe-e3d8-4dc0-aafe-fcd1de367544","Type":"ContainerStarted","Data":"81f104d6f0c3634f6e28e3da9be4e194337f4da4477d1dd1aeabc24649843668"} Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.713971 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.733516 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-rzpkl" event={"ID":"f42b3e9f-55a9-47fe-a5b8-51b36d622657","Type":"ContainerStarted","Data":"1e621b2e421b3757773675e899a21354f106589758e82e0463fdf29969c25968"} Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.734265 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-rzpkl" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.735193 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" podStartSLOduration=18.118768965 podStartE2EDuration="31.735178483s" podCreationTimestamp="2026-03-17 11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:59.63481294 +0000 UTC m=+1102.760940698" lastFinishedPulling="2026-03-17 11:30:13.251222448 +0000 UTC m=+1116.377350216" observedRunningTime="2026-03-17 11:30:13.724945901 +0000 UTC m=+1116.851073659" watchObservedRunningTime="2026-03-17 11:30:13.735178483 +0000 UTC m=+1116.861306241" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.736897 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fh8v8" podStartSLOduration=13.951330144 podStartE2EDuration="31.736889208s" podCreationTimestamp="2026-03-17 
11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:44.590248136 +0000 UTC m=+1087.716375894" lastFinishedPulling="2026-03-17 11:30:02.3758072 +0000 UTC m=+1105.501934958" observedRunningTime="2026-03-17 11:30:13.697317358 +0000 UTC m=+1116.823445106" watchObservedRunningTime="2026-03-17 11:30:13.736889208 +0000 UTC m=+1116.863016956" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.752920 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-fhqr4" podStartSLOduration=2.087736226 podStartE2EDuration="30.752886403s" podCreationTimestamp="2026-03-17 11:29:43 +0000 UTC" firstStartedPulling="2026-03-17 11:29:44.586073991 +0000 UTC m=+1087.712201749" lastFinishedPulling="2026-03-17 11:30:13.251224158 +0000 UTC m=+1116.377351926" observedRunningTime="2026-03-17 11:30:13.74034025 +0000 UTC m=+1116.866468018" watchObservedRunningTime="2026-03-17 11:30:13.752886403 +0000 UTC m=+1116.879014161" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.793631 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-44jhz" podStartSLOduration=1.930467362 podStartE2EDuration="30.793613334s" podCreationTimestamp="2026-03-17 11:29:43 +0000 UTC" firstStartedPulling="2026-03-17 11:29:44.482294817 +0000 UTC m=+1087.608422575" lastFinishedPulling="2026-03-17 11:30:13.345440789 +0000 UTC m=+1116.471568547" observedRunningTime="2026-03-17 11:30:13.793586493 +0000 UTC m=+1116.919714261" watchObservedRunningTime="2026-03-17 11:30:13.793613334 +0000 UTC m=+1116.919741102" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.796831 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rf6p4" podStartSLOduration=9.601494261 podStartE2EDuration="30.796822189s" podCreationTimestamp="2026-03-17 11:29:43 +0000 UTC" firstStartedPulling="2026-03-17 11:29:44.611590329 +0000 UTC m=+1087.737718087" lastFinishedPulling="2026-03-17 11:30:05.806918257 +0000 UTC m=+1108.933046015" observedRunningTime="2026-03-17 11:30:13.769067042 +0000 UTC m=+1116.895194800" watchObservedRunningTime="2026-03-17 11:30:13.796822189 +0000 UTC m=+1116.922949947" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.810875 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-g252s" podStartSLOduration=2.867512913 podStartE2EDuration="31.810860201s" podCreationTimestamp="2026-03-17 11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:44.362789718 +0000 UTC m=+1087.488917466" lastFinishedPulling="2026-03-17 11:30:13.306136996 +0000 UTC m=+1116.432264754" observedRunningTime="2026-03-17 11:30:13.806173917 +0000 UTC m=+1116.932301685" watchObservedRunningTime="2026-03-17 11:30:13.810860201 +0000 UTC m=+1116.936987959" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.827278 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-c9j2m" podStartSLOduration=3.052258157 podStartE2EDuration="31.827262437s" podCreationTimestamp="2026-03-17 11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:44.476304021 +0000 UTC m=+1087.602431779" lastFinishedPulling="2026-03-17 11:30:13.251308271 +0000 UTC m=+1116.377436059" observedRunningTime="2026-03-17 11:30:13.823550768 +0000 UTC 
m=+1116.949678526" watchObservedRunningTime="2026-03-17 11:30:13.827262437 +0000 UTC m=+1116.953390185" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.845849 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" podStartSLOduration=13.84583397 podStartE2EDuration="13.84583397s" podCreationTimestamp="2026-03-17 11:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:30:13.84393521 +0000 UTC m=+1116.970062968" watchObservedRunningTime="2026-03-17 11:30:13.84583397 +0000 UTC m=+1116.971961728" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.888348 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" podStartSLOduration=27.482803261 podStartE2EDuration="31.888324278s" podCreationTimestamp="2026-03-17 11:29:42 +0000 UTC" firstStartedPulling="2026-03-17 11:29:59.363325756 +0000 UTC m=+1102.489453514" lastFinishedPulling="2026-03-17 11:30:03.768846753 +0000 UTC m=+1106.894974531" observedRunningTime="2026-03-17 11:30:13.886783106 +0000 UTC m=+1117.012910874" watchObservedRunningTime="2026-03-17 11:30:13.888324278 +0000 UTC m=+1117.014452036" Mar 17 11:30:13 crc kubenswrapper[4742]: I0317 11:30:13.889668 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-rzpkl" podStartSLOduration=1.933454451 podStartE2EDuration="30.889660773s" podCreationTimestamp="2026-03-17 11:29:43 +0000 UTC" firstStartedPulling="2026-03-17 11:29:44.368343292 +0000 UTC m=+1087.494471040" lastFinishedPulling="2026-03-17 11:30:13.324549564 +0000 UTC m=+1116.450677362" observedRunningTime="2026-03-17 11:30:13.873926235 +0000 UTC m=+1117.000053983" watchObservedRunningTime="2026-03-17 11:30:13.889660773 +0000 UTC m=+1117.015788531" Mar 17 11:30:14 crc kubenswrapper[4742]: I0317 11:30:14.745361 4742 generic.go:334] "Generic (PLEG): container finished" podID="677f677c-39b4-4713-afc0-57fb6b36a1a3" containerID="ece0f4a648dedc1d926b58223b995f25c2021607970bd4c3840ad15871418b48" exitCode=0 Mar 17 11:30:14 crc kubenswrapper[4742]: I0317 11:30:14.745433 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562450-jzztm" event={"ID":"677f677c-39b4-4713-afc0-57fb6b36a1a3","Type":"ContainerDied","Data":"ece0f4a648dedc1d926b58223b995f25c2021607970bd4c3840ad15871418b48"} Mar 17 11:30:14 crc kubenswrapper[4742]: I0317 11:30:14.749210 4742 generic.go:334] "Generic (PLEG): container finished" podID="7057d36e-e38b-41f9-98f1-7f136f859aec" containerID="93817ed3cdb466999dc3b44859ea51e8c6851d26ddf4dca3d5259a85d5219e31" exitCode=0 Mar 17 11:30:14 crc kubenswrapper[4742]: I0317 11:30:14.749329 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" event={"ID":"7057d36e-e38b-41f9-98f1-7f136f859aec","Type":"ContainerDied","Data":"93817ed3cdb466999dc3b44859ea51e8c6851d26ddf4dca3d5259a85d5219e31"} Mar 17 11:30:15 crc kubenswrapper[4742]: I0317 11:30:15.613234 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: 
\"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:30:15 crc kubenswrapper[4742]: I0317 11:30:15.625325 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7d6829e2-3788-4653-91e4-bff007a7bb5d-webhook-certs\") pod \"openstack-operator-controller-manager-c748c4754-6hffs\" (UID: \"7d6829e2-3788-4653-91e4-bff007a7bb5d\") " pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:30:15 crc kubenswrapper[4742]: I0317 11:30:15.770367 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.075039 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562450-jzztm" Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.078225 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.221639 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcjwf\" (UniqueName: \"kubernetes.io/projected/7057d36e-e38b-41f9-98f1-7f136f859aec-kube-api-access-bcjwf\") pod \"7057d36e-e38b-41f9-98f1-7f136f859aec\" (UID: \"7057d36e-e38b-41f9-98f1-7f136f859aec\") " Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.221747 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvf84\" (UniqueName: \"kubernetes.io/projected/677f677c-39b4-4713-afc0-57fb6b36a1a3-kube-api-access-bvf84\") pod \"677f677c-39b4-4713-afc0-57fb6b36a1a3\" (UID: \"677f677c-39b4-4713-afc0-57fb6b36a1a3\") " Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.221864 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7057d36e-e38b-41f9-98f1-7f136f859aec-config-volume\") pod \"7057d36e-e38b-41f9-98f1-7f136f859aec\" (UID: \"7057d36e-e38b-41f9-98f1-7f136f859aec\") " Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.221982 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7057d36e-e38b-41f9-98f1-7f136f859aec-secret-volume\") pod \"7057d36e-e38b-41f9-98f1-7f136f859aec\" (UID: \"7057d36e-e38b-41f9-98f1-7f136f859aec\") " Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.223216 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7057d36e-e38b-41f9-98f1-7f136f859aec-config-volume" (OuterVolumeSpecName: "config-volume") pod "7057d36e-e38b-41f9-98f1-7f136f859aec" (UID: "7057d36e-e38b-41f9-98f1-7f136f859aec"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.223650 4742 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7057d36e-e38b-41f9-98f1-7f136f859aec-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.226785 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/677f677c-39b4-4713-afc0-57fb6b36a1a3-kube-api-access-bvf84" (OuterVolumeSpecName: "kube-api-access-bvf84") pod "677f677c-39b4-4713-afc0-57fb6b36a1a3" (UID: "677f677c-39b4-4713-afc0-57fb6b36a1a3"). InnerVolumeSpecName "kube-api-access-bvf84". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.226966 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7057d36e-e38b-41f9-98f1-7f136f859aec-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7057d36e-e38b-41f9-98f1-7f136f859aec" (UID: "7057d36e-e38b-41f9-98f1-7f136f859aec"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.227100 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7057d36e-e38b-41f9-98f1-7f136f859aec-kube-api-access-bcjwf" (OuterVolumeSpecName: "kube-api-access-bcjwf") pod "7057d36e-e38b-41f9-98f1-7f136f859aec" (UID: "7057d36e-e38b-41f9-98f1-7f136f859aec"). InnerVolumeSpecName "kube-api-access-bcjwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.320775 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs"] Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.325346 4742 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7057d36e-e38b-41f9-98f1-7f136f859aec-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.325399 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcjwf\" (UniqueName: \"kubernetes.io/projected/7057d36e-e38b-41f9-98f1-7f136f859aec-kube-api-access-bcjwf\") on node \"crc\" DevicePath \"\"" Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.325424 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvf84\" (UniqueName: \"kubernetes.io/projected/677f677c-39b4-4713-afc0-57fb6b36a1a3-kube-api-access-bvf84\") on node \"crc\" DevicePath \"\"" Mar 17 11:30:16 crc kubenswrapper[4742]: W0317 11:30:16.325753 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d6829e2_3788_4653_91e4_bff007a7bb5d.slice/crio-66cd583ba822a6df953986c9ef46ff9cf1de003aaffa15751c0439ba86696d98 WatchSource:0}: Error finding container 66cd583ba822a6df953986c9ef46ff9cf1de003aaffa15751c0439ba86696d98: Status 404 returned error can't find the container with id 66cd583ba822a6df953986c9ef46ff9cf1de003aaffa15751c0439ba86696d98 Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.766111 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562450-jzztm" Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.766163 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562450-jzztm" event={"ID":"677f677c-39b4-4713-afc0-57fb6b36a1a3","Type":"ContainerDied","Data":"846430701c1b841d89eb0745403cd6e20ed1d8726bb3f9ee12abf6533e3ec22e"} Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.766214 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="846430701c1b841d89eb0745403cd6e20ed1d8726bb3f9ee12abf6533e3ec22e" Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.767605 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" event={"ID":"7d6829e2-3788-4653-91e4-bff007a7bb5d","Type":"ContainerStarted","Data":"9c95bdbc5663e720997bcdd45577390365620cc9ccb7204ba51610b74653575a"} Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.767652 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" event={"ID":"7d6829e2-3788-4653-91e4-bff007a7bb5d","Type":"ContainerStarted","Data":"66cd583ba822a6df953986c9ef46ff9cf1de003aaffa15751c0439ba86696d98"} Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.767672 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.770189 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" event={"ID":"7057d36e-e38b-41f9-98f1-7f136f859aec","Type":"ContainerDied","Data":"4497c2f2c083d7249332f3d18b93aef00b201713775000d214dec04109b406c4"} Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.770213 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4497c2f2c083d7249332f3d18b93aef00b201713775000d214dec04109b406c4" Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.770481 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v" Mar 17 11:30:16 crc kubenswrapper[4742]: I0317 11:30:16.824152 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" podStartSLOduration=33.824129348 podStartE2EDuration="33.824129348s" podCreationTimestamp="2026-03-17 11:29:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:30:16.821794146 +0000 UTC m=+1119.947921914" watchObservedRunningTime="2026-03-17 11:30:16.824129348 +0000 UTC m=+1119.950257116" Mar 17 11:30:17 crc kubenswrapper[4742]: I0317 11:30:17.147631 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562444-9mtj5"] Mar 17 11:30:17 crc kubenswrapper[4742]: I0317 11:30:17.155197 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562444-9mtj5"] Mar 17 11:30:18 crc kubenswrapper[4742]: I0317 11:30:18.682413 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="520a6146-8525-4d61-bbc0-8fe2c576b266" path="/var/lib/kubelet/pods/520a6146-8525-4d61-bbc0-8fe2c576b266/volumes" Mar 17 11:30:19 crc kubenswrapper[4742]: I0317 11:30:19.053070 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-njktv" Mar 17 11:30:19 crc kubenswrapper[4742]: I0317 11:30:19.093061 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-89w9s" Mar 17 11:30:23 crc kubenswrapper[4742]: I0317 11:30:23.426348 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-g252s" Mar 17 11:30:23 crc kubenswrapper[4742]: I0317 11:30:23.608709 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-rzpkl" Mar 17 11:30:23 crc kubenswrapper[4742]: I0317 11:30:23.791594 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-c9j2m" Mar 17 11:30:23 crc kubenswrapper[4742]: I0317 11:30:23.847100 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-fh8v8" Mar 17 11:30:23 crc kubenswrapper[4742]: I0317 11:30:23.852250 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-fhqr4" Mar 17 11:30:23 crc kubenswrapper[4742]: I0317 11:30:23.873129 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-rf6p4" Mar 17 11:30:25 crc kubenswrapper[4742]: I0317 11:30:25.781097 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-c748c4754-6hffs" Mar 17 11:30:40 crc kubenswrapper[4742]: I0317 11:30:40.274702 4742 scope.go:117] "RemoveContainer" containerID="6b21e189edc79bc7724b4a120234d0cc9adf3120f669a465905ed388680b1afe" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.763116 4742 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-675f4bcbfc-mwpw9"] Mar 17 11:30:44 crc kubenswrapper[4742]: E0317 11:30:44.763899 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7057d36e-e38b-41f9-98f1-7f136f859aec" containerName="collect-profiles" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.763928 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="7057d36e-e38b-41f9-98f1-7f136f859aec" containerName="collect-profiles" Mar 17 11:30:44 crc kubenswrapper[4742]: E0317 11:30:44.763953 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677f677c-39b4-4713-afc0-57fb6b36a1a3" containerName="oc" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.763959 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="677f677c-39b4-4713-afc0-57fb6b36a1a3" containerName="oc" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.764110 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="677f677c-39b4-4713-afc0-57fb6b36a1a3" containerName="oc" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.764123 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="7057d36e-e38b-41f9-98f1-7f136f859aec" containerName="collect-profiles" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.764802 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mwpw9" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.768354 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-h92cz" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.768598 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.768818 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.769003 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.784386 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mwpw9"] Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.838683 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6wk79"] Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.839959 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6wk79" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.841740 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.845116 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6wk79"] Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.884126 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzks2\" (UniqueName: \"kubernetes.io/projected/0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b-kube-api-access-hzks2\") pod \"dnsmasq-dns-675f4bcbfc-mwpw9\" (UID: \"0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mwpw9" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.884173 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b-config\") pod \"dnsmasq-dns-675f4bcbfc-mwpw9\" (UID: \"0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mwpw9" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.985507 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2dvv\" (UniqueName: \"kubernetes.io/projected/34d43601-1d5a-48fd-b086-515bf4ad953e-kube-api-access-l2dvv\") pod \"dnsmasq-dns-78dd6ddcc-6wk79\" (UID: \"34d43601-1d5a-48fd-b086-515bf4ad953e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6wk79" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.985564 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d43601-1d5a-48fd-b086-515bf4ad953e-config\") pod \"dnsmasq-dns-78dd6ddcc-6wk79\" (UID: \"34d43601-1d5a-48fd-b086-515bf4ad953e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6wk79" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.985666 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34d43601-1d5a-48fd-b086-515bf4ad953e-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6wk79\" (UID: \"34d43601-1d5a-48fd-b086-515bf4ad953e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6wk79" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.985897 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzks2\" (UniqueName: \"kubernetes.io/projected/0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b-kube-api-access-hzks2\") pod \"dnsmasq-dns-675f4bcbfc-mwpw9\" (UID: \"0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mwpw9" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.986092 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b-config\") pod \"dnsmasq-dns-675f4bcbfc-mwpw9\" (UID: \"0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mwpw9" Mar 17 11:30:44 crc kubenswrapper[4742]: I0317 11:30:44.987182 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b-config\") pod \"dnsmasq-dns-675f4bcbfc-mwpw9\" (UID: \"0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mwpw9" Mar 17 
11:30:45 crc kubenswrapper[4742]: I0317 11:30:45.009510 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzks2\" (UniqueName: \"kubernetes.io/projected/0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b-kube-api-access-hzks2\") pod \"dnsmasq-dns-675f4bcbfc-mwpw9\" (UID: \"0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mwpw9" Mar 17 11:30:45 crc kubenswrapper[4742]: I0317 11:30:45.088420 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2dvv\" (UniqueName: \"kubernetes.io/projected/34d43601-1d5a-48fd-b086-515bf4ad953e-kube-api-access-l2dvv\") pod \"dnsmasq-dns-78dd6ddcc-6wk79\" (UID: \"34d43601-1d5a-48fd-b086-515bf4ad953e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6wk79" Mar 17 11:30:45 crc kubenswrapper[4742]: I0317 11:30:45.088979 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d43601-1d5a-48fd-b086-515bf4ad953e-config\") pod \"dnsmasq-dns-78dd6ddcc-6wk79\" (UID: \"34d43601-1d5a-48fd-b086-515bf4ad953e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6wk79" Mar 17 11:30:45 crc kubenswrapper[4742]: I0317 11:30:45.089122 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34d43601-1d5a-48fd-b086-515bf4ad953e-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6wk79\" (UID: \"34d43601-1d5a-48fd-b086-515bf4ad953e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6wk79" Mar 17 11:30:45 crc kubenswrapper[4742]: I0317 11:30:45.090301 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d43601-1d5a-48fd-b086-515bf4ad953e-config\") pod \"dnsmasq-dns-78dd6ddcc-6wk79\" (UID: \"34d43601-1d5a-48fd-b086-515bf4ad953e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6wk79" Mar 17 11:30:45 crc kubenswrapper[4742]: I0317 11:30:45.090659 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34d43601-1d5a-48fd-b086-515bf4ad953e-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-6wk79\" (UID: \"34d43601-1d5a-48fd-b086-515bf4ad953e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6wk79" Mar 17 11:30:45 crc kubenswrapper[4742]: I0317 11:30:45.093076 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mwpw9" Mar 17 11:30:45 crc kubenswrapper[4742]: I0317 11:30:45.119089 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2dvv\" (UniqueName: \"kubernetes.io/projected/34d43601-1d5a-48fd-b086-515bf4ad953e-kube-api-access-l2dvv\") pod \"dnsmasq-dns-78dd6ddcc-6wk79\" (UID: \"34d43601-1d5a-48fd-b086-515bf4ad953e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-6wk79" Mar 17 11:30:45 crc kubenswrapper[4742]: I0317 11:30:45.154175 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6wk79" Mar 17 11:30:45 crc kubenswrapper[4742]: I0317 11:30:45.541347 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mwpw9"] Mar 17 11:30:45 crc kubenswrapper[4742]: I0317 11:30:45.639249 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6wk79"] Mar 17 11:30:45 crc kubenswrapper[4742]: W0317 11:30:45.641749 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34d43601_1d5a_48fd_b086_515bf4ad953e.slice/crio-37b93efcb7d8e5f67dcc755ecf49673e17789c6dd5e851d75f1f6d7d618e982c WatchSource:0}: Error finding container 37b93efcb7d8e5f67dcc755ecf49673e17789c6dd5e851d75f1f6d7d618e982c: Status 404 returned error can't find the container with id 37b93efcb7d8e5f67dcc755ecf49673e17789c6dd5e851d75f1f6d7d618e982c Mar 17 11:30:46 crc kubenswrapper[4742]: I0317 11:30:46.013358 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-6wk79" event={"ID":"34d43601-1d5a-48fd-b086-515bf4ad953e","Type":"ContainerStarted","Data":"37b93efcb7d8e5f67dcc755ecf49673e17789c6dd5e851d75f1f6d7d618e982c"} Mar 17 11:30:46 crc kubenswrapper[4742]: I0317 11:30:46.014140 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-mwpw9" event={"ID":"0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b","Type":"ContainerStarted","Data":"7c3b56594b7dd22805e9ff4bf1d5fb0a64fca748c0a2b0540ae292feb5ce5aa1"} Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.463343 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mwpw9"] Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.487566 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-4x7wf"] Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.488614 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.517980 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-4x7wf"] Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.540830 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbt9s\" (UniqueName: \"kubernetes.io/projected/434c361d-ee53-4862-86c8-a0eddb1ae902-kube-api-access-pbt9s\") pod \"dnsmasq-dns-5ccc8479f9-4x7wf\" (UID: \"434c361d-ee53-4862-86c8-a0eddb1ae902\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.540894 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/434c361d-ee53-4862-86c8-a0eddb1ae902-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-4x7wf\" (UID: \"434c361d-ee53-4862-86c8-a0eddb1ae902\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.541000 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/434c361d-ee53-4862-86c8-a0eddb1ae902-config\") pod \"dnsmasq-dns-5ccc8479f9-4x7wf\" (UID: \"434c361d-ee53-4862-86c8-a0eddb1ae902\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.641917 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/434c361d-ee53-4862-86c8-a0eddb1ae902-config\") pod \"dnsmasq-dns-5ccc8479f9-4x7wf\" (UID: \"434c361d-ee53-4862-86c8-a0eddb1ae902\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.641978 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbt9s\" (UniqueName: \"kubernetes.io/projected/434c361d-ee53-4862-86c8-a0eddb1ae902-kube-api-access-pbt9s\") pod \"dnsmasq-dns-5ccc8479f9-4x7wf\" (UID: \"434c361d-ee53-4862-86c8-a0eddb1ae902\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.642012 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/434c361d-ee53-4862-86c8-a0eddb1ae902-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-4x7wf\" (UID: \"434c361d-ee53-4862-86c8-a0eddb1ae902\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.642775 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/434c361d-ee53-4862-86c8-a0eddb1ae902-config\") pod \"dnsmasq-dns-5ccc8479f9-4x7wf\" (UID: \"434c361d-ee53-4862-86c8-a0eddb1ae902\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.643512 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/434c361d-ee53-4862-86c8-a0eddb1ae902-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-4x7wf\" (UID: \"434c361d-ee53-4862-86c8-a0eddb1ae902\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.663767 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbt9s\" (UniqueName: 
\"kubernetes.io/projected/434c361d-ee53-4862-86c8-a0eddb1ae902-kube-api-access-pbt9s\") pod \"dnsmasq-dns-5ccc8479f9-4x7wf\" (UID: \"434c361d-ee53-4862-86c8-a0eddb1ae902\") " pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.760109 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6wk79"] Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.781247 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l84hg"] Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.782985 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.791834 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l84hg"] Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.810609 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.944762 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3cecdb55-b664-4224-bad8-524bf97f879b-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-l84hg\" (UID: \"3cecdb55-b664-4224-bad8-524bf97f879b\") " pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.944795 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cecdb55-b664-4224-bad8-524bf97f879b-config\") pod \"dnsmasq-dns-57d769cc4f-l84hg\" (UID: \"3cecdb55-b664-4224-bad8-524bf97f879b\") " pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" Mar 17 11:30:47 crc kubenswrapper[4742]: I0317 11:30:47.944851 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96t68\" (UniqueName: \"kubernetes.io/projected/3cecdb55-b664-4224-bad8-524bf97f879b-kube-api-access-96t68\") pod \"dnsmasq-dns-57d769cc4f-l84hg\" (UID: \"3cecdb55-b664-4224-bad8-524bf97f879b\") " pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.046199 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3cecdb55-b664-4224-bad8-524bf97f879b-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-l84hg\" (UID: \"3cecdb55-b664-4224-bad8-524bf97f879b\") " pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.046230 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cecdb55-b664-4224-bad8-524bf97f879b-config\") pod \"dnsmasq-dns-57d769cc4f-l84hg\" (UID: \"3cecdb55-b664-4224-bad8-524bf97f879b\") " pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.046301 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96t68\" (UniqueName: \"kubernetes.io/projected/3cecdb55-b664-4224-bad8-524bf97f879b-kube-api-access-96t68\") pod \"dnsmasq-dns-57d769cc4f-l84hg\" (UID: \"3cecdb55-b664-4224-bad8-524bf97f879b\") " pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.047455 4742 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cecdb55-b664-4224-bad8-524bf97f879b-config\") pod \"dnsmasq-dns-57d769cc4f-l84hg\" (UID: \"3cecdb55-b664-4224-bad8-524bf97f879b\") " pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.047573 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3cecdb55-b664-4224-bad8-524bf97f879b-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-l84hg\" (UID: \"3cecdb55-b664-4224-bad8-524bf97f879b\") " pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.068135 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96t68\" (UniqueName: \"kubernetes.io/projected/3cecdb55-b664-4224-bad8-524bf97f879b-kube-api-access-96t68\") pod \"dnsmasq-dns-57d769cc4f-l84hg\" (UID: \"3cecdb55-b664-4224-bad8-524bf97f879b\") " pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.104661 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.259169 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-4x7wf"] Mar 17 11:30:48 crc kubenswrapper[4742]: W0317 11:30:48.272927 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod434c361d_ee53_4862_86c8_a0eddb1ae902.slice/crio-58cde58720a1e96ea88081a646ba83b5685a343cc862d98a2131158f0ca39ce4 WatchSource:0}: Error finding container 58cde58720a1e96ea88081a646ba83b5685a343cc862d98a2131158f0ca39ce4: Status 404 returned error can't find the container with id 58cde58720a1e96ea88081a646ba83b5685a343cc862d98a2131158f0ca39ce4 Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.520291 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l84hg"] Mar 17 11:30:48 crc kubenswrapper[4742]: W0317 11:30:48.532579 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cecdb55_b664_4224_bad8_524bf97f879b.slice/crio-1b8b25ae34c93ed5025802661461f0e326549868c8b8a51fc69681aad26e4e5e WatchSource:0}: Error finding container 1b8b25ae34c93ed5025802661461f0e326549868c8b8a51fc69681aad26e4e5e: Status 404 returned error can't find the container with id 1b8b25ae34c93ed5025802661461f0e326549868c8b8a51fc69681aad26e4e5e Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.621063 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.622521 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.635149 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.635211 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.635240 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.635375 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.635460 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.635524 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-2l6fw" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.635761 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.656447 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.758206 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.758260 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.758287 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.758306 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0d71d306-a987-411e-82fe-e18450aa18a2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.758332 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.758355 4742 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.758368 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d71d306-a987-411e-82fe-e18450aa18a2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.758397 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.758417 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.758431 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr5zf\" (UniqueName: \"kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-kube-api-access-rr5zf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.758454 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.863006 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.863457 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.863479 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0d71d306-a987-411e-82fe-e18450aa18a2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.863531 4742 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.863586 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d71d306-a987-411e-82fe-e18450aa18a2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.863604 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.863666 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.863699 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.863719 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr5zf\" (UniqueName: \"kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-kube-api-access-rr5zf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.863778 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.863799 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.863931 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.864462 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.873948 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.876425 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.876651 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.881318 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.881688 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0d71d306-a987-411e-82fe-e18450aa18a2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.882564 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d71d306-a987-411e-82fe-e18450aa18a2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.883505 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.884758 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.907127 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr5zf\" (UniqueName: \"kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-kube-api-access-rr5zf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.918505 4742 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.920180 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.924470 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.924516 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.924480 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.924775 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.925162 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ls6t5" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.926298 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.926429 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.928706 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.930547 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 11:30:48 crc kubenswrapper[4742]: I0317 11:30:48.950956 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.064875 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" event={"ID":"434c361d-ee53-4862-86c8-a0eddb1ae902","Type":"ContainerStarted","Data":"58cde58720a1e96ea88081a646ba83b5685a343cc862d98a2131158f0ca39ce4"} Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.068445 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" event={"ID":"3cecdb55-b664-4224-bad8-524bf97f879b","Type":"ContainerStarted","Data":"1b8b25ae34c93ed5025802661461f0e326549868c8b8a51fc69681aad26e4e5e"} Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.068989 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.069045 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.069068 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-config-data\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.069243 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndv99\" (UniqueName: \"kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-kube-api-access-ndv99\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.069278 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.069299 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.069448 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.069471 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.069541 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.069608 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.069632 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.171204 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-config-data\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.171249 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndv99\" (UniqueName: \"kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-kube-api-access-ndv99\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.171298 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.172741 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.172783 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.172801 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " 
pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.172839 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.172871 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.172898 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.172944 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.172969 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.175628 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.177042 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-config-data\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.183580 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.183748 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.184481 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.184744 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.186953 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.187086 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.187979 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.188797 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.209933 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndv99\" (UniqueName: \"kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-kube-api-access-ndv99\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.213169 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.280022 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.445809 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 11:30:49 crc kubenswrapper[4742]: I0317 11:30:49.967245 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.032560 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.033715 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.033802 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.053832 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-5kpfc" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.054210 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.054613 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.054707 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.058941 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.188416 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.188485 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/91d27a2f-a471-4f90-aabb-9a021036805e-kolla-config\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.188511 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91d27a2f-a471-4f90-aabb-9a021036805e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.188561 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/91d27a2f-a471-4f90-aabb-9a021036805e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.188581 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d27a2f-a471-4f90-aabb-9a021036805e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.188596 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzzxw\" (UniqueName: \"kubernetes.io/projected/91d27a2f-a471-4f90-aabb-9a021036805e-kube-api-access-vzzxw\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.188613 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/91d27a2f-a471-4f90-aabb-9a021036805e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.188822 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/91d27a2f-a471-4f90-aabb-9a021036805e-config-data-default\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.290708 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/91d27a2f-a471-4f90-aabb-9a021036805e-kolla-config\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.290765 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91d27a2f-a471-4f90-aabb-9a021036805e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.290815 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/91d27a2f-a471-4f90-aabb-9a021036805e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.290833 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzzxw\" (UniqueName: \"kubernetes.io/projected/91d27a2f-a471-4f90-aabb-9a021036805e-kube-api-access-vzzxw\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.290848 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d27a2f-a471-4f90-aabb-9a021036805e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.290863 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/91d27a2f-a471-4f90-aabb-9a021036805e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.290889 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/91d27a2f-a471-4f90-aabb-9a021036805e-config-data-default\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.290933 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.292600 4742 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/91d27a2f-a471-4f90-aabb-9a021036805e-kolla-config\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.293577 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91d27a2f-a471-4f90-aabb-9a021036805e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.293765 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/91d27a2f-a471-4f90-aabb-9a021036805e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.297851 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.298520 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/91d27a2f-a471-4f90-aabb-9a021036805e-config-data-default\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.316588 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d27a2f-a471-4f90-aabb-9a021036805e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.320549 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/91d27a2f-a471-4f90-aabb-9a021036805e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.329038 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.336234 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzzxw\" (UniqueName: \"kubernetes.io/projected/91d27a2f-a471-4f90-aabb-9a021036805e-kube-api-access-vzzxw\") pod \"openstack-galera-0\" (UID: \"91d27a2f-a471-4f90-aabb-9a021036805e\") " pod="openstack/openstack-galera-0" Mar 17 11:30:50 crc kubenswrapper[4742]: I0317 11:30:50.374065 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.296615 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.298805 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.301480 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-8zbnc" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.301642 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.301751 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.302357 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.329579 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.411852 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.411931 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.411985 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.412005 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.412045 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.412068 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.412099 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvfcx\" (UniqueName: \"kubernetes.io/projected/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-kube-api-access-fvfcx\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.412153 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.516622 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.516664 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.516696 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvfcx\" (UniqueName: \"kubernetes.io/projected/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-kube-api-access-fvfcx\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.516784 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.516825 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.516853 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.516962 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.516985 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.517810 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.518040 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.521564 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.522193 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.523888 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.541519 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.541519 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.547605 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvfcx\" (UniqueName: \"kubernetes.io/projected/ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5-kube-api-access-fvfcx\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc 
kubenswrapper[4742]: I0317 11:30:51.553142 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.554531 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.556029 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5\") " pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.558035 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.558246 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.558350 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-nvdfd" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.578646 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.617769 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.720496 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8qql\" (UniqueName: \"kubernetes.io/projected/5cbf7636-aea9-4186-be9f-a4b25776158c-kube-api-access-p8qql\") pod \"memcached-0\" (UID: \"5cbf7636-aea9-4186-be9f-a4b25776158c\") " pod="openstack/memcached-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.720621 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cbf7636-aea9-4186-be9f-a4b25776158c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5cbf7636-aea9-4186-be9f-a4b25776158c\") " pod="openstack/memcached-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.720666 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5cbf7636-aea9-4186-be9f-a4b25776158c-kolla-config\") pod \"memcached-0\" (UID: \"5cbf7636-aea9-4186-be9f-a4b25776158c\") " pod="openstack/memcached-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.720685 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cbf7636-aea9-4186-be9f-a4b25776158c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5cbf7636-aea9-4186-be9f-a4b25776158c\") " pod="openstack/memcached-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.720745 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5cbf7636-aea9-4186-be9f-a4b25776158c-config-data\") pod \"memcached-0\" (UID: \"5cbf7636-aea9-4186-be9f-a4b25776158c\") " pod="openstack/memcached-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.822556 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/5cbf7636-aea9-4186-be9f-a4b25776158c-config-data\") pod \"memcached-0\" (UID: \"5cbf7636-aea9-4186-be9f-a4b25776158c\") " pod="openstack/memcached-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.822619 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8qql\" (UniqueName: \"kubernetes.io/projected/5cbf7636-aea9-4186-be9f-a4b25776158c-kube-api-access-p8qql\") pod \"memcached-0\" (UID: \"5cbf7636-aea9-4186-be9f-a4b25776158c\") " pod="openstack/memcached-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.822665 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cbf7636-aea9-4186-be9f-a4b25776158c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5cbf7636-aea9-4186-be9f-a4b25776158c\") " pod="openstack/memcached-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.822705 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5cbf7636-aea9-4186-be9f-a4b25776158c-kolla-config\") pod \"memcached-0\" (UID: \"5cbf7636-aea9-4186-be9f-a4b25776158c\") " pod="openstack/memcached-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.822724 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cbf7636-aea9-4186-be9f-a4b25776158c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5cbf7636-aea9-4186-be9f-a4b25776158c\") " pod="openstack/memcached-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.826483 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5cbf7636-aea9-4186-be9f-a4b25776158c-kolla-config\") pod \"memcached-0\" (UID: \"5cbf7636-aea9-4186-be9f-a4b25776158c\") " pod="openstack/memcached-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.829100 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cbf7636-aea9-4186-be9f-a4b25776158c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5cbf7636-aea9-4186-be9f-a4b25776158c\") " pod="openstack/memcached-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.829282 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5cbf7636-aea9-4186-be9f-a4b25776158c-config-data\") pod \"memcached-0\" (UID: \"5cbf7636-aea9-4186-be9f-a4b25776158c\") " pod="openstack/memcached-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.833992 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cbf7636-aea9-4186-be9f-a4b25776158c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5cbf7636-aea9-4186-be9f-a4b25776158c\") " pod="openstack/memcached-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.839551 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8qql\" (UniqueName: \"kubernetes.io/projected/5cbf7636-aea9-4186-be9f-a4b25776158c-kube-api-access-p8qql\") pod \"memcached-0\" (UID: \"5cbf7636-aea9-4186-be9f-a4b25776158c\") " pod="openstack/memcached-0" Mar 17 11:30:51 crc kubenswrapper[4742]: I0317 11:30:51.918318 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 17 11:30:53 crc kubenswrapper[4742]: I0317 11:30:53.722570 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 11:30:53 crc kubenswrapper[4742]: I0317 11:30:53.724654 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 17 11:30:53 crc kubenswrapper[4742]: I0317 11:30:53.729700 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 11:30:53 crc kubenswrapper[4742]: I0317 11:30:53.738646 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-xwgs2" Mar 17 11:30:53 crc kubenswrapper[4742]: I0317 11:30:53.854857 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5982x\" (UniqueName: \"kubernetes.io/projected/b4dcc55d-79fc-492b-980b-527f9a71a89c-kube-api-access-5982x\") pod \"kube-state-metrics-0\" (UID: \"b4dcc55d-79fc-492b-980b-527f9a71a89c\") " pod="openstack/kube-state-metrics-0" Mar 17 11:30:53 crc kubenswrapper[4742]: I0317 11:30:53.956540 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5982x\" (UniqueName: \"kubernetes.io/projected/b4dcc55d-79fc-492b-980b-527f9a71a89c-kube-api-access-5982x\") pod \"kube-state-metrics-0\" (UID: \"b4dcc55d-79fc-492b-980b-527f9a71a89c\") " pod="openstack/kube-state-metrics-0" Mar 17 11:30:53 crc kubenswrapper[4742]: I0317 11:30:53.978974 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5982x\" (UniqueName: \"kubernetes.io/projected/b4dcc55d-79fc-492b-980b-527f9a71a89c-kube-api-access-5982x\") pod \"kube-state-metrics-0\" (UID: \"b4dcc55d-79fc-492b-980b-527f9a71a89c\") " pod="openstack/kube-state-metrics-0" Mar 17 11:30:54 crc kubenswrapper[4742]: I0317 11:30:54.048782 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 17 11:30:54 crc kubenswrapper[4742]: I0317 11:30:54.135604 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6","Type":"ContainerStarted","Data":"8a77c3656f9054dd75e53300541f1f19547cc5f8d1cd2c159960b51c828a299a"} Mar 17 11:30:54 crc kubenswrapper[4742]: I0317 11:30:54.137079 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d71d306-a987-411e-82fe-e18450aa18a2","Type":"ContainerStarted","Data":"c1aed7c500e49f2ba7a5ee494e2df33faded14c193f7d81894b2b827b90ee903"} Mar 17 11:30:56 crc kubenswrapper[4742]: I0317 11:30:56.825878 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-4j5jz"] Mar 17 11:30:56 crc kubenswrapper[4742]: I0317 11:30:56.827455 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:56 crc kubenswrapper[4742]: I0317 11:30:56.838528 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4j5jz"] Mar 17 11:30:56 crc kubenswrapper[4742]: I0317 11:30:56.839739 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 17 11:30:56 crc kubenswrapper[4742]: I0317 11:30:56.840114 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-25r8j" Mar 17 11:30:56 crc kubenswrapper[4742]: I0317 11:30:56.840693 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 17 11:30:56 crc kubenswrapper[4742]: I0317 11:30:56.897041 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/158a0d7f-e22f-4f44-aca2-efb59ff90439-var-log-ovn\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:56 crc kubenswrapper[4742]: I0317 11:30:56.897094 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/158a0d7f-e22f-4f44-aca2-efb59ff90439-scripts\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:56 crc kubenswrapper[4742]: I0317 11:30:56.897116 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/158a0d7f-e22f-4f44-aca2-efb59ff90439-var-run-ovn\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:56 crc kubenswrapper[4742]: I0317 11:30:56.897138 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158a0d7f-e22f-4f44-aca2-efb59ff90439-combined-ca-bundle\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:56 crc kubenswrapper[4742]: I0317 11:30:56.897222 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/158a0d7f-e22f-4f44-aca2-efb59ff90439-var-run\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:56 crc kubenswrapper[4742]: I0317 11:30:56.897266 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzzf2\" (UniqueName: \"kubernetes.io/projected/158a0d7f-e22f-4f44-aca2-efb59ff90439-kube-api-access-rzzf2\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:56 crc kubenswrapper[4742]: I0317 11:30:56.897296 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/158a0d7f-e22f-4f44-aca2-efb59ff90439-ovn-controller-tls-certs\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:56 crc kubenswrapper[4742]: I0317 11:30:56.903166 4742 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-dmqzv"] Mar 17 11:30:56 crc kubenswrapper[4742]: I0317 11:30:56.914430 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dmqzv"] Mar 17 11:30:56 crc kubenswrapper[4742]: I0317 11:30:56.914533 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:56.998800 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/158a0d7f-e22f-4f44-aca2-efb59ff90439-var-log-ovn\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:56.998840 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/158a0d7f-e22f-4f44-aca2-efb59ff90439-scripts\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:56.998879 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84h86\" (UniqueName: \"kubernetes.io/projected/dd5cf259-c4bf-44cf-b101-bcc78c153852-kube-api-access-84h86\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:56.998897 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/158a0d7f-e22f-4f44-aca2-efb59ff90439-var-run-ovn\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:56.998928 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158a0d7f-e22f-4f44-aca2-efb59ff90439-combined-ca-bundle\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:56.999405 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/158a0d7f-e22f-4f44-aca2-efb59ff90439-var-run-ovn\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:56.999463 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dd5cf259-c4bf-44cf-b101-bcc78c153852-var-run\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:56.999487 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/dd5cf259-c4bf-44cf-b101-bcc78c153852-var-lib\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:56.999504 4742 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/158a0d7f-e22f-4f44-aca2-efb59ff90439-var-log-ovn\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:56.999563 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/158a0d7f-e22f-4f44-aca2-efb59ff90439-var-run\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:56.999635 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzzf2\" (UniqueName: \"kubernetes.io/projected/158a0d7f-e22f-4f44-aca2-efb59ff90439-kube-api-access-rzzf2\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:56.999690 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/158a0d7f-e22f-4f44-aca2-efb59ff90439-ovn-controller-tls-certs\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:56.999716 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/dd5cf259-c4bf-44cf-b101-bcc78c153852-etc-ovs\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:56.999972 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/158a0d7f-e22f-4f44-aca2-efb59ff90439-var-run\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.000071 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/dd5cf259-c4bf-44cf-b101-bcc78c153852-var-log\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.000098 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd5cf259-c4bf-44cf-b101-bcc78c153852-scripts\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.002147 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/158a0d7f-e22f-4f44-aca2-efb59ff90439-scripts\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.005544 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/158a0d7f-e22f-4f44-aca2-efb59ff90439-ovn-controller-tls-certs\") pod \"ovn-controller-4j5jz\" 
(UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.006142 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/158a0d7f-e22f-4f44-aca2-efb59ff90439-combined-ca-bundle\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.017961 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzzf2\" (UniqueName: \"kubernetes.io/projected/158a0d7f-e22f-4f44-aca2-efb59ff90439-kube-api-access-rzzf2\") pod \"ovn-controller-4j5jz\" (UID: \"158a0d7f-e22f-4f44-aca2-efb59ff90439\") " pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.101143 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/dd5cf259-c4bf-44cf-b101-bcc78c153852-etc-ovs\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.101188 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/dd5cf259-c4bf-44cf-b101-bcc78c153852-var-log\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.101207 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd5cf259-c4bf-44cf-b101-bcc78c153852-scripts\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.101244 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84h86\" (UniqueName: \"kubernetes.io/projected/dd5cf259-c4bf-44cf-b101-bcc78c153852-kube-api-access-84h86\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.101269 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dd5cf259-c4bf-44cf-b101-bcc78c153852-var-run\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.101288 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/dd5cf259-c4bf-44cf-b101-bcc78c153852-var-lib\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.101554 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/dd5cf259-c4bf-44cf-b101-bcc78c153852-var-lib\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.101670 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/dd5cf259-c4bf-44cf-b101-bcc78c153852-etc-ovs\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.101747 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/dd5cf259-c4bf-44cf-b101-bcc78c153852-var-log\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.103467 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd5cf259-c4bf-44cf-b101-bcc78c153852-scripts\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.103758 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dd5cf259-c4bf-44cf-b101-bcc78c153852-var-run\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.123449 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84h86\" (UniqueName: \"kubernetes.io/projected/dd5cf259-c4bf-44cf-b101-bcc78c153852-kube-api-access-84h86\") pod \"ovn-controller-ovs-dmqzv\" (UID: \"dd5cf259-c4bf-44cf-b101-bcc78c153852\") " pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.151361 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4j5jz" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.203816 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.205227 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.207492 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.207818 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.207970 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.208112 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.215759 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-l7422" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.218167 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.242683 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.304724 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a4f3d5f-526a-4163-8dbb-a019050a0e03-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.304765 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7a4f3d5f-526a-4163-8dbb-a019050a0e03-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.304789 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9wn8\" (UniqueName: \"kubernetes.io/projected/7a4f3d5f-526a-4163-8dbb-a019050a0e03-kube-api-access-l9wn8\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.304811 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.304863 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a4f3d5f-526a-4163-8dbb-a019050a0e03-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.304880 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a4f3d5f-526a-4163-8dbb-a019050a0e03-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.304918 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a4f3d5f-526a-4163-8dbb-a019050a0e03-config\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.304934 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a4f3d5f-526a-4163-8dbb-a019050a0e03-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.406583 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a4f3d5f-526a-4163-8dbb-a019050a0e03-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: 
I0317 11:30:57.406629 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7a4f3d5f-526a-4163-8dbb-a019050a0e03-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.406653 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9wn8\" (UniqueName: \"kubernetes.io/projected/7a4f3d5f-526a-4163-8dbb-a019050a0e03-kube-api-access-l9wn8\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.406706 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.406766 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a4f3d5f-526a-4163-8dbb-a019050a0e03-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.406782 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a4f3d5f-526a-4163-8dbb-a019050a0e03-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.406812 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a4f3d5f-526a-4163-8dbb-a019050a0e03-config\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.406832 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a4f3d5f-526a-4163-8dbb-a019050a0e03-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.407164 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7a4f3d5f-526a-4163-8dbb-a019050a0e03-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.407353 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.408017 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a4f3d5f-526a-4163-8dbb-a019050a0e03-config\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " 
pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.408046 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a4f3d5f-526a-4163-8dbb-a019050a0e03-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.411764 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a4f3d5f-526a-4163-8dbb-a019050a0e03-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.414241 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a4f3d5f-526a-4163-8dbb-a019050a0e03-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.420779 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a4f3d5f-526a-4163-8dbb-a019050a0e03-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.421884 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9wn8\" (UniqueName: \"kubernetes.io/projected/7a4f3d5f-526a-4163-8dbb-a019050a0e03-kube-api-access-l9wn8\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.433497 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"7a4f3d5f-526a-4163-8dbb-a019050a0e03\") " pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:57 crc kubenswrapper[4742]: I0317 11:30:57.519209 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 17 11:30:58 crc kubenswrapper[4742]: I0317 11:30:58.970272 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.251344 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.252730 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.254406 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.254602 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-kxwnl" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.254767 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.261214 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.262696 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.368311 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.368368 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.368396 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.368442 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.368459 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.368475 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2fl6\" (UniqueName: \"kubernetes.io/projected/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-kube-api-access-f2fl6\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.369001 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.369202 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-config\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.470804 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.470858 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-config\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.470899 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.470955 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.470981 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.471003 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.471021 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.471036 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2fl6\" (UniqueName: \"kubernetes.io/projected/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-kube-api-access-f2fl6\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.471343 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.471601 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.471862 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-config\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.473050 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.477125 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.478133 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.478246 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.488438 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2fl6\" (UniqueName: \"kubernetes.io/projected/3bba6aef-f8ff-436a-b3c1-97fbe9819ff1-kube-api-access-f2fl6\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.499044 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1\") " pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:01 crc kubenswrapper[4742]: I0317 11:31:01.579547 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:08 crc kubenswrapper[4742]: W0317 11:31:08.686348 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4dcc55d_79fc_492b_980b_527f9a71a89c.slice/crio-63e1b818eeff3e0e6e3862a10daa3ff995376df9c6b852294c4d9665668972d7 WatchSource:0}: Error finding container 63e1b818eeff3e0e6e3862a10daa3ff995376df9c6b852294c4d9665668972d7: Status 404 returned error can't find the container with id 63e1b818eeff3e0e6e3862a10daa3ff995376df9c6b852294c4d9665668972d7 Mar 17 11:31:09 crc kubenswrapper[4742]: I0317 11:31:09.264696 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b4dcc55d-79fc-492b-980b-527f9a71a89c","Type":"ContainerStarted","Data":"63e1b818eeff3e0e6e3862a10daa3ff995376df9c6b852294c4d9665668972d7"} Mar 17 11:31:09 crc kubenswrapper[4742]: E0317 11:31:09.881595 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 17 11:31:09 crc kubenswrapper[4742]: E0317 11:31:09.881779 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rr5zf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(0d71d306-a987-411e-82fe-e18450aa18a2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 11:31:09 crc kubenswrapper[4742]: E0317 11:31:09.882966 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="0d71d306-a987-411e-82fe-e18450aa18a2" Mar 17 11:31:09 crc kubenswrapper[4742]: E0317 11:31:09.899316 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 17 11:31:09 crc kubenswrapper[4742]: E0317 11:31:09.899531 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ndv99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 11:31:09 crc kubenswrapper[4742]: E0317 11:31:09.900806 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" Mar 17 11:31:10 crc kubenswrapper[4742]: E0317 11:31:10.273674 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="0d71d306-a987-411e-82fe-e18450aa18a2" Mar 17 11:31:10 crc kubenswrapper[4742]: E0317 11:31:10.273809 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" Mar 17 11:31:10 crc kubenswrapper[4742]: I0317 11:31:10.504010 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 17 11:31:10 crc kubenswrapper[4742]: E0317 11:31:10.916408 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 17 11:31:10 crc 
kubenswrapper[4742]: E0317 11:31:10.917793 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l2dvv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-6wk79_openstack(34d43601-1d5a-48fd-b086-515bf4ad953e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 11:31:10 crc kubenswrapper[4742]: E0317 11:31:10.918917 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-6wk79" podUID="34d43601-1d5a-48fd-b086-515bf4ad953e" Mar 17 11:31:10 crc kubenswrapper[4742]: E0317 11:31:10.935720 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 17 11:31:10 crc kubenswrapper[4742]: E0317 11:31:10.935883 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hzks2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-mwpw9_openstack(0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 11:31:10 crc kubenswrapper[4742]: E0317 11:31:10.938141 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-mwpw9" podUID="0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b" Mar 17 11:31:10 crc kubenswrapper[4742]: E0317 11:31:10.952445 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 17 11:31:10 crc kubenswrapper[4742]: E0317 11:31:10.952591 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pbt9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-4x7wf_openstack(434c361d-ee53-4862-86c8-a0eddb1ae902): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 11:31:10 crc kubenswrapper[4742]: E0317 11:31:10.957175 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" podUID="434c361d-ee53-4862-86c8-a0eddb1ae902" Mar 17 11:31:10 crc kubenswrapper[4742]: E0317 11:31:10.998398 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 17 11:31:10 crc kubenswrapper[4742]: E0317 11:31:10.998548 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-96t68,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-l84hg_openstack(3cecdb55-b664-4224-bad8-524bf97f879b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 11:31:10 crc kubenswrapper[4742]: E0317 11:31:10.999932 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" podUID="3cecdb55-b664-4224-bad8-524bf97f879b" Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.280037 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7a4f3d5f-526a-4163-8dbb-a019050a0e03","Type":"ContainerStarted","Data":"a3811e276e42a946347890253a575b4dd3d5083d6ff7cd0f4f69a4dcc35131f4"} Mar 17 11:31:11 crc kubenswrapper[4742]: E0317 11:31:11.281687 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" podUID="434c361d-ee53-4862-86c8-a0eddb1ae902" Mar 17 11:31:11 crc kubenswrapper[4742]: E0317 11:31:11.281973 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" podUID="3cecdb55-b664-4224-bad8-524bf97f879b" Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.401990 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.407234 4742 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.509795 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.612024 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4j5jz"] Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.618991 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-dmqzv"] Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.700096 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 17 11:31:11 crc kubenswrapper[4742]: W0317 11:31:11.794822 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bba6aef_f8ff_436a_b3c1_97fbe9819ff1.slice/crio-4db54d06b6b9da307ffbe6e9b44328c7342b44e8b08ee144e1200738cc542ee8 WatchSource:0}: Error finding container 4db54d06b6b9da307ffbe6e9b44328c7342b44e8b08ee144e1200738cc542ee8: Status 404 returned error can't find the container with id 4db54d06b6b9da307ffbe6e9b44328c7342b44e8b08ee144e1200738cc542ee8 Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.834196 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mwpw9" Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.840319 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6wk79" Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.951569 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2dvv\" (UniqueName: \"kubernetes.io/projected/34d43601-1d5a-48fd-b086-515bf4ad953e-kube-api-access-l2dvv\") pod \"34d43601-1d5a-48fd-b086-515bf4ad953e\" (UID: \"34d43601-1d5a-48fd-b086-515bf4ad953e\") " Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.951631 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34d43601-1d5a-48fd-b086-515bf4ad953e-dns-svc\") pod \"34d43601-1d5a-48fd-b086-515bf4ad953e\" (UID: \"34d43601-1d5a-48fd-b086-515bf4ad953e\") " Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.951709 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d43601-1d5a-48fd-b086-515bf4ad953e-config\") pod \"34d43601-1d5a-48fd-b086-515bf4ad953e\" (UID: \"34d43601-1d5a-48fd-b086-515bf4ad953e\") " Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.951738 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzks2\" (UniqueName: \"kubernetes.io/projected/0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b-kube-api-access-hzks2\") pod \"0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b\" (UID: \"0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b\") " Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.951872 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b-config\") pod \"0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b\" (UID: \"0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b\") " Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.952836 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b-config" (OuterVolumeSpecName: "config") pod "0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b" (UID: "0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.953359 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34d43601-1d5a-48fd-b086-515bf4ad953e-config" (OuterVolumeSpecName: "config") pod "34d43601-1d5a-48fd-b086-515bf4ad953e" (UID: "34d43601-1d5a-48fd-b086-515bf4ad953e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.953556 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34d43601-1d5a-48fd-b086-515bf4ad953e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "34d43601-1d5a-48fd-b086-515bf4ad953e" (UID: "34d43601-1d5a-48fd-b086-515bf4ad953e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.957375 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b-kube-api-access-hzks2" (OuterVolumeSpecName: "kube-api-access-hzks2") pod "0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b" (UID: "0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b"). InnerVolumeSpecName "kube-api-access-hzks2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:31:11 crc kubenswrapper[4742]: I0317 11:31:11.958313 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34d43601-1d5a-48fd-b086-515bf4ad953e-kube-api-access-l2dvv" (OuterVolumeSpecName: "kube-api-access-l2dvv") pod "34d43601-1d5a-48fd-b086-515bf4ad953e" (UID: "34d43601-1d5a-48fd-b086-515bf4ad953e"). InnerVolumeSpecName "kube-api-access-l2dvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.069946 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2dvv\" (UniqueName: \"kubernetes.io/projected/34d43601-1d5a-48fd-b086-515bf4ad953e-kube-api-access-l2dvv\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.069984 4742 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34d43601-1d5a-48fd-b086-515bf4ad953e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.069995 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d43601-1d5a-48fd-b086-515bf4ad953e-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.070013 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzks2\" (UniqueName: \"kubernetes.io/projected/0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b-kube-api-access-hzks2\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.070028 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.290696 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5cbf7636-aea9-4186-be9f-a4b25776158c","Type":"ContainerStarted","Data":"851dc06b0af13924e89025b442e22972141cb813cd38c4e707a97b5eab99dfa1"} Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.291666 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5","Type":"ContainerStarted","Data":"61559d9ae355922e54da1f7aa613bd9d58ac207826168cafd523236dfd8c8c23"} Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.292835 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"91d27a2f-a471-4f90-aabb-9a021036805e","Type":"ContainerStarted","Data":"64058bde6550c0c46f51c5913bf5676c55037fdd11efb34e0b0419a5fee25f37"} Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.293661 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4j5jz" event={"ID":"158a0d7f-e22f-4f44-aca2-efb59ff90439","Type":"ContainerStarted","Data":"4278f0cecf554cf525663768acc6507ed5c672e1468817bc58a605015b683d62"} Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.294713 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-mwpw9" event={"ID":"0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b","Type":"ContainerDied","Data":"7c3b56594b7dd22805e9ff4bf1d5fb0a64fca748c0a2b0540ae292feb5ce5aa1"} Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.294748 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mwpw9" Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.295980 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dmqzv" event={"ID":"dd5cf259-c4bf-44cf-b101-bcc78c153852","Type":"ContainerStarted","Data":"7de83e3e9c7270fac660a4128a2180c46a869b3c914b12f79082fd81260cea28"} Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.298107 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1","Type":"ContainerStarted","Data":"4db54d06b6b9da307ffbe6e9b44328c7342b44e8b08ee144e1200738cc542ee8"} Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.299814 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-6wk79" event={"ID":"34d43601-1d5a-48fd-b086-515bf4ad953e","Type":"ContainerDied","Data":"37b93efcb7d8e5f67dcc755ecf49673e17789c6dd5e851d75f1f6d7d618e982c"} Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.299873 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-6wk79" Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.377138 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mwpw9"] Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.388044 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mwpw9"] Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.411365 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6wk79"] Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.412397 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-6wk79"] Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.674561 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b" path="/var/lib/kubelet/pods/0ba8e0bb-ae78-4e62-9609-0eebc8dfb87b/volumes" Mar 17 11:31:12 crc kubenswrapper[4742]: I0317 11:31:12.675062 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34d43601-1d5a-48fd-b086-515bf4ad953e" path="/var/lib/kubelet/pods/34d43601-1d5a-48fd-b086-515bf4ad953e/volumes" Mar 17 11:31:13 crc kubenswrapper[4742]: I0317 11:31:13.308904 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b4dcc55d-79fc-492b-980b-527f9a71a89c","Type":"ContainerStarted","Data":"341eb69fcfb599dc7de91fa2492483dfbc6fa83f21a65d9c08516c60f0202081"} Mar 17 11:31:13 crc kubenswrapper[4742]: I0317 11:31:13.309319 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 17 11:31:13 crc kubenswrapper[4742]: I0317 11:31:13.324313 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=15.990252106 podStartE2EDuration="20.324294746s" podCreationTimestamp="2026-03-17 11:30:53 +0000 UTC" firstStartedPulling="2026-03-17 11:31:08.691941088 +0000 UTC m=+1171.818068846" lastFinishedPulling="2026-03-17 11:31:13.025983728 +0000 UTC m=+1176.152111486" observedRunningTime="2026-03-17 11:31:13.320945268 +0000 UTC m=+1176.447073026" watchObservedRunningTime="2026-03-17 11:31:13.324294746 +0000 UTC m=+1176.450422504" Mar 17 11:31:18 crc kubenswrapper[4742]: I0317 11:31:18.043943 4742 patch_prober.go:28] interesting 
pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:31:18 crc kubenswrapper[4742]: I0317 11:31:18.044480 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:31:19 crc kubenswrapper[4742]: I0317 11:31:19.357697 4742 generic.go:334] "Generic (PLEG): container finished" podID="dd5cf259-c4bf-44cf-b101-bcc78c153852" containerID="1ca6eb5167227900af29d77eabd9e5a94f1ffe9e3db293cf10e007fe98838175" exitCode=0 Mar 17 11:31:19 crc kubenswrapper[4742]: I0317 11:31:19.357769 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dmqzv" event={"ID":"dd5cf259-c4bf-44cf-b101-bcc78c153852","Type":"ContainerDied","Data":"1ca6eb5167227900af29d77eabd9e5a94f1ffe9e3db293cf10e007fe98838175"} Mar 17 11:31:19 crc kubenswrapper[4742]: I0317 11:31:19.360170 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1","Type":"ContainerStarted","Data":"54e8e27422c847e17abfab27dc9e9f8a9ddb8e242a8b099be6613cf55dcabdac"} Mar 17 11:31:19 crc kubenswrapper[4742]: I0317 11:31:19.364081 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5cbf7636-aea9-4186-be9f-a4b25776158c","Type":"ContainerStarted","Data":"7ee9c3fc7db06e48a87bbe17b48e39844fb55de724b90a0023647b861301af83"} Mar 17 11:31:19 crc kubenswrapper[4742]: I0317 11:31:19.364230 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 17 11:31:19 crc kubenswrapper[4742]: I0317 11:31:19.365738 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5","Type":"ContainerStarted","Data":"b2f9ba6d53be997c7301d556cb1e080201bed774484e712e953d8aae118a7da6"} Mar 17 11:31:19 crc kubenswrapper[4742]: I0317 11:31:19.368803 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"91d27a2f-a471-4f90-aabb-9a021036805e","Type":"ContainerStarted","Data":"aa86fb2731189fb4a3d8265f42066af231190eefb461decfce68442e25f10b14"} Mar 17 11:31:19 crc kubenswrapper[4742]: I0317 11:31:19.370641 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7a4f3d5f-526a-4163-8dbb-a019050a0e03","Type":"ContainerStarted","Data":"b3f5b73181ef69517783ecae22403cb49df1da20ae60385aade32b6ef2b6a713"} Mar 17 11:31:19 crc kubenswrapper[4742]: I0317 11:31:19.374169 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4j5jz" event={"ID":"158a0d7f-e22f-4f44-aca2-efb59ff90439","Type":"ContainerStarted","Data":"146daad1f74088a36c983eecd75d6095a0aba986f4cfe6ee64ef329ec7c5033f"} Mar 17 11:31:19 crc kubenswrapper[4742]: I0317 11:31:19.374779 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-4j5jz" Mar 17 11:31:19 crc kubenswrapper[4742]: I0317 11:31:19.415084 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-4j5jz" 
podStartSLOduration=17.001216221 podStartE2EDuration="23.415071274s" podCreationTimestamp="2026-03-17 11:30:56 +0000 UTC" firstStartedPulling="2026-03-17 11:31:11.807288383 +0000 UTC m=+1174.933416151" lastFinishedPulling="2026-03-17 11:31:18.221143446 +0000 UTC m=+1181.347271204" observedRunningTime="2026-03-17 11:31:19.410297217 +0000 UTC m=+1182.536424975" watchObservedRunningTime="2026-03-17 11:31:19.415071274 +0000 UTC m=+1182.541199032" Mar 17 11:31:19 crc kubenswrapper[4742]: I0317 11:31:19.457074 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=22.069111704 podStartE2EDuration="28.457056289s" podCreationTimestamp="2026-03-17 11:30:51 +0000 UTC" firstStartedPulling="2026-03-17 11:31:11.791211436 +0000 UTC m=+1174.917339194" lastFinishedPulling="2026-03-17 11:31:18.179156021 +0000 UTC m=+1181.305283779" observedRunningTime="2026-03-17 11:31:19.449617151 +0000 UTC m=+1182.575744909" watchObservedRunningTime="2026-03-17 11:31:19.457056289 +0000 UTC m=+1182.583184047" Mar 17 11:31:20 crc kubenswrapper[4742]: I0317 11:31:20.383344 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dmqzv" event={"ID":"dd5cf259-c4bf-44cf-b101-bcc78c153852","Type":"ContainerStarted","Data":"d5bdcf4cf5a28b10e3b877ffeee99200f6366791324c727d7b1b2547cdfba563"} Mar 17 11:31:20 crc kubenswrapper[4742]: I0317 11:31:20.383652 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-dmqzv" event={"ID":"dd5cf259-c4bf-44cf-b101-bcc78c153852","Type":"ContainerStarted","Data":"f80a0c6447bb76a6e19b4d207d15daa3333ecd00b9767dcb797a3dec6521fd00"} Mar 17 11:31:20 crc kubenswrapper[4742]: I0317 11:31:20.411364 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-dmqzv" podStartSLOduration=18.008163057 podStartE2EDuration="24.411346156s" podCreationTimestamp="2026-03-17 11:30:56 +0000 UTC" firstStartedPulling="2026-03-17 11:31:11.794513234 +0000 UTC m=+1174.920640992" lastFinishedPulling="2026-03-17 11:31:18.197696323 +0000 UTC m=+1181.323824091" observedRunningTime="2026-03-17 11:31:20.403278452 +0000 UTC m=+1183.529406210" watchObservedRunningTime="2026-03-17 11:31:20.411346156 +0000 UTC m=+1183.537473914" Mar 17 11:31:21 crc kubenswrapper[4742]: I0317 11:31:21.392516 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:31:21 crc kubenswrapper[4742]: I0317 11:31:21.392564 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:31:22 crc kubenswrapper[4742]: I0317 11:31:22.409003 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3bba6aef-f8ff-436a-b3c1-97fbe9819ff1","Type":"ContainerStarted","Data":"de11e1072112d60ed98ddac26ddef75321a0b7491987476f42544ee1ff99dc27"} Mar 17 11:31:22 crc kubenswrapper[4742]: I0317 11:31:22.412890 4742 generic.go:334] "Generic (PLEG): container finished" podID="ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5" containerID="b2f9ba6d53be997c7301d556cb1e080201bed774484e712e953d8aae118a7da6" exitCode=0 Mar 17 11:31:22 crc kubenswrapper[4742]: I0317 11:31:22.412967 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5","Type":"ContainerDied","Data":"b2f9ba6d53be997c7301d556cb1e080201bed774484e712e953d8aae118a7da6"} Mar 17 11:31:22 crc 
kubenswrapper[4742]: I0317 11:31:22.416997 4742 generic.go:334] "Generic (PLEG): container finished" podID="91d27a2f-a471-4f90-aabb-9a021036805e" containerID="aa86fb2731189fb4a3d8265f42066af231190eefb461decfce68442e25f10b14" exitCode=0 Mar 17 11:31:22 crc kubenswrapper[4742]: I0317 11:31:22.417079 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"91d27a2f-a471-4f90-aabb-9a021036805e","Type":"ContainerDied","Data":"aa86fb2731189fb4a3d8265f42066af231190eefb461decfce68442e25f10b14"} Mar 17 11:31:22 crc kubenswrapper[4742]: I0317 11:31:22.422178 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7a4f3d5f-526a-4163-8dbb-a019050a0e03","Type":"ContainerStarted","Data":"438fd51ac267f0ac0c4f6f4be5d812b167080c056234d557f3cee257688cec72"} Mar 17 11:31:22 crc kubenswrapper[4742]: I0317 11:31:22.440819 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.094553847 podStartE2EDuration="22.440789131s" podCreationTimestamp="2026-03-17 11:31:00 +0000 UTC" firstStartedPulling="2026-03-17 11:31:11.800621576 +0000 UTC m=+1174.926749334" lastFinishedPulling="2026-03-17 11:31:22.14685685 +0000 UTC m=+1185.272984618" observedRunningTime="2026-03-17 11:31:22.435413778 +0000 UTC m=+1185.561541576" watchObservedRunningTime="2026-03-17 11:31:22.440789131 +0000 UTC m=+1185.566916919" Mar 17 11:31:22 crc kubenswrapper[4742]: I0317 11:31:22.481116 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=15.256723139 podStartE2EDuration="26.481095s" podCreationTimestamp="2026-03-17 11:30:56 +0000 UTC" firstStartedPulling="2026-03-17 11:31:10.918158264 +0000 UTC m=+1174.044286022" lastFinishedPulling="2026-03-17 11:31:22.142530115 +0000 UTC m=+1185.268657883" observedRunningTime="2026-03-17 11:31:22.464546242 +0000 UTC m=+1185.590674030" watchObservedRunningTime="2026-03-17 11:31:22.481095 +0000 UTC m=+1185.607222748" Mar 17 11:31:22 crc kubenswrapper[4742]: I0317 11:31:22.520042 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 17 11:31:22 crc kubenswrapper[4742]: I0317 11:31:22.580430 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:22 crc kubenswrapper[4742]: I0317 11:31:22.623732 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.434150 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5","Type":"ContainerStarted","Data":"34f5a4f88e0ac0743f179e9f5fb7bebbf914d39b26a3bba9ecd1580e2aa0735e"} Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.436733 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" event={"ID":"434c361d-ee53-4862-86c8-a0eddb1ae902","Type":"ContainerStarted","Data":"06786f0c491a80a2ad68941841042fb59d99cf6de74f120ff48d89c5cd8fb767"} Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.438812 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"91d27a2f-a471-4f90-aabb-9a021036805e","Type":"ContainerStarted","Data":"d0c8f54b16c175d1011e17f3976b9df283c30adf1d6285b9fc6f23acd42a136d"} Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 
11:31:23.439221 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.452914 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=27.04689333 podStartE2EDuration="33.452886253s" podCreationTimestamp="2026-03-17 11:30:50 +0000 UTC" firstStartedPulling="2026-03-17 11:31:11.791401682 +0000 UTC m=+1174.917529450" lastFinishedPulling="2026-03-17 11:31:18.197394565 +0000 UTC m=+1181.323522373" observedRunningTime="2026-03-17 11:31:23.451183088 +0000 UTC m=+1186.577310846" watchObservedRunningTime="2026-03-17 11:31:23.452886253 +0000 UTC m=+1186.579014011" Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.490458 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=29.063300825 podStartE2EDuration="35.49044112s" podCreationTimestamp="2026-03-17 11:30:48 +0000 UTC" firstStartedPulling="2026-03-17 11:31:11.794857113 +0000 UTC m=+1174.920984871" lastFinishedPulling="2026-03-17 11:31:18.221997408 +0000 UTC m=+1181.348125166" observedRunningTime="2026-03-17 11:31:23.483575607 +0000 UTC m=+1186.609703385" watchObservedRunningTime="2026-03-17 11:31:23.49044112 +0000 UTC m=+1186.616568898" Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.493084 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.765747 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l84hg"] Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.803573 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vpgmb"] Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.805388 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.818090 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.824493 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vpgmb"] Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.884231 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-pmxjd"] Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.885192 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.890236 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.896047 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-pmxjd"] Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.929419 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-vpgmb\" (UID: \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\") " pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.929562 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-vpgmb\" (UID: \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\") " pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.929592 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c878n\" (UniqueName: \"kubernetes.io/projected/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-kube-api-access-c878n\") pod \"dnsmasq-dns-7f896c8c65-vpgmb\" (UID: \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\") " pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:23 crc kubenswrapper[4742]: I0317 11:31:23.929614 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-config\") pod \"dnsmasq-dns-7f896c8c65-vpgmb\" (UID: \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\") " pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.031127 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-vpgmb\" (UID: \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\") " pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.031198 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-ovn-rundir\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.031257 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-ovs-rundir\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.031284 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-config\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " 
pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.031316 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rkjq\" (UniqueName: \"kubernetes.io/projected/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-kube-api-access-9rkjq\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.031347 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-vpgmb\" (UID: \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\") " pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.031364 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c878n\" (UniqueName: \"kubernetes.io/projected/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-kube-api-access-c878n\") pod \"dnsmasq-dns-7f896c8c65-vpgmb\" (UID: \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\") " pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.031395 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-config\") pod \"dnsmasq-dns-7f896c8c65-vpgmb\" (UID: \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\") " pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.031420 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-combined-ca-bundle\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.031441 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.032973 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-vpgmb\" (UID: \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\") " pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.032971 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-config\") pod \"dnsmasq-dns-7f896c8c65-vpgmb\" (UID: \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\") " pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.033374 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-vpgmb\" (UID: \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\") " 
pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.046753 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c878n\" (UniqueName: \"kubernetes.io/projected/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-kube-api-access-c878n\") pod \"dnsmasq-dns-7f896c8c65-vpgmb\" (UID: \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\") " pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.054194 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.132769 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-combined-ca-bundle\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.132807 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.132885 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-ovn-rundir\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.132931 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-ovs-rundir\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.132958 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-config\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.132974 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rkjq\" (UniqueName: \"kubernetes.io/projected/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-kube-api-access-9rkjq\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.133550 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.134242 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-ovn-rundir\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.134281 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-ovs-rundir\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.134392 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-config\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.136601 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-combined-ca-bundle\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.139358 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.150763 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rkjq\" (UniqueName: \"kubernetes.io/projected/0a50ef5e-ba73-4d00-baba-b8ef6c621d71-kube-api-access-9rkjq\") pod \"ovn-controller-metrics-pmxjd\" (UID: \"0a50ef5e-ba73-4d00-baba-b8ef6c621d71\") " pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.203693 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-pmxjd" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.302888 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-4x7wf"] Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.325972 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7d25z"] Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.333983 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.337861 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.342804 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7d25z"] Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.437053 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-7d25z\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.437131 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-config\") pod \"dnsmasq-dns-86db49b7ff-7d25z\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.437227 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcgd2\" (UniqueName: \"kubernetes.io/projected/3a91260a-abb6-4e26-b041-c39b36369405-kube-api-access-xcgd2\") pod \"dnsmasq-dns-86db49b7ff-7d25z\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.437281 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-7d25z\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.437453 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-7d25z\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.446576 4742 generic.go:334] "Generic (PLEG): container finished" podID="434c361d-ee53-4862-86c8-a0eddb1ae902" containerID="06786f0c491a80a2ad68941841042fb59d99cf6de74f120ff48d89c5cd8fb767" exitCode=0 Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.446630 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" event={"ID":"434c361d-ee53-4862-86c8-a0eddb1ae902","Type":"ContainerDied","Data":"06786f0c491a80a2ad68941841042fb59d99cf6de74f120ff48d89c5cd8fb767"} Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.449464 4742 generic.go:334] "Generic (PLEG): container finished" podID="3cecdb55-b664-4224-bad8-524bf97f879b" containerID="abe59923ff989f9d02259c6fd8407148ec0e31e60885828c313d370b8ae8a073" exitCode=0 Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.450212 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" 
event={"ID":"3cecdb55-b664-4224-bad8-524bf97f879b","Type":"ContainerDied","Data":"abe59923ff989f9d02259c6fd8407148ec0e31e60885828c313d370b8ae8a073"} Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.521163 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.539200 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcgd2\" (UniqueName: \"kubernetes.io/projected/3a91260a-abb6-4e26-b041-c39b36369405-kube-api-access-xcgd2\") pod \"dnsmasq-dns-86db49b7ff-7d25z\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.539258 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-7d25z\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.539301 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-7d25z\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.539445 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-7d25z\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.539473 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-config\") pod \"dnsmasq-dns-86db49b7ff-7d25z\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.540713 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-config\") pod \"dnsmasq-dns-86db49b7ff-7d25z\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.540743 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-7d25z\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.541384 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-7d25z\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.541935 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-7d25z\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.557918 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.566131 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcgd2\" (UniqueName: \"kubernetes.io/projected/3a91260a-abb6-4e26-b041-c39b36369405-kube-api-access-xcgd2\") pod \"dnsmasq-dns-86db49b7ff-7d25z\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.672168 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.679258 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vpgmb"] Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.802696 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-pmxjd"] Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.889541 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.917425 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.951475 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/434c361d-ee53-4862-86c8-a0eddb1ae902-dns-svc\") pod \"434c361d-ee53-4862-86c8-a0eddb1ae902\" (UID: \"434c361d-ee53-4862-86c8-a0eddb1ae902\") " Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.951542 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96t68\" (UniqueName: \"kubernetes.io/projected/3cecdb55-b664-4224-bad8-524bf97f879b-kube-api-access-96t68\") pod \"3cecdb55-b664-4224-bad8-524bf97f879b\" (UID: \"3cecdb55-b664-4224-bad8-524bf97f879b\") " Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.951569 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/434c361d-ee53-4862-86c8-a0eddb1ae902-config\") pod \"434c361d-ee53-4862-86c8-a0eddb1ae902\" (UID: \"434c361d-ee53-4862-86c8-a0eddb1ae902\") " Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.951682 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cecdb55-b664-4224-bad8-524bf97f879b-config\") pod \"3cecdb55-b664-4224-bad8-524bf97f879b\" (UID: \"3cecdb55-b664-4224-bad8-524bf97f879b\") " Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.951723 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbt9s\" (UniqueName: \"kubernetes.io/projected/434c361d-ee53-4862-86c8-a0eddb1ae902-kube-api-access-pbt9s\") pod \"434c361d-ee53-4862-86c8-a0eddb1ae902\" (UID: \"434c361d-ee53-4862-86c8-a0eddb1ae902\") " Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.951757 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3cecdb55-b664-4224-bad8-524bf97f879b-dns-svc\") pod \"3cecdb55-b664-4224-bad8-524bf97f879b\" (UID: \"3cecdb55-b664-4224-bad8-524bf97f879b\") " Mar 17 11:31:24 crc kubenswrapper[4742]: I0317 11:31:24.955843 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/434c361d-ee53-4862-86c8-a0eddb1ae902-kube-api-access-pbt9s" (OuterVolumeSpecName: "kube-api-access-pbt9s") pod "434c361d-ee53-4862-86c8-a0eddb1ae902" (UID: "434c361d-ee53-4862-86c8-a0eddb1ae902"). InnerVolumeSpecName "kube-api-access-pbt9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.054726 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbt9s\" (UniqueName: \"kubernetes.io/projected/434c361d-ee53-4862-86c8-a0eddb1ae902-kube-api-access-pbt9s\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.054780 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cecdb55-b664-4224-bad8-524bf97f879b-kube-api-access-96t68" (OuterVolumeSpecName: "kube-api-access-96t68") pod "3cecdb55-b664-4224-bad8-524bf97f879b" (UID: "3cecdb55-b664-4224-bad8-524bf97f879b"). InnerVolumeSpecName "kube-api-access-96t68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.159046 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96t68\" (UniqueName: \"kubernetes.io/projected/3cecdb55-b664-4224-bad8-524bf97f879b-kube-api-access-96t68\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.195838 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7d25z"] Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.389466 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cecdb55-b664-4224-bad8-524bf97f879b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3cecdb55-b664-4224-bad8-524bf97f879b" (UID: "3cecdb55-b664-4224-bad8-524bf97f879b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.416624 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/434c361d-ee53-4862-86c8-a0eddb1ae902-config" (OuterVolumeSpecName: "config") pod "434c361d-ee53-4862-86c8-a0eddb1ae902" (UID: "434c361d-ee53-4862-86c8-a0eddb1ae902"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.421089 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/434c361d-ee53-4862-86c8-a0eddb1ae902-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "434c361d-ee53-4862-86c8-a0eddb1ae902" (UID: "434c361d-ee53-4862-86c8-a0eddb1ae902"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.437863 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cecdb55-b664-4224-bad8-524bf97f879b-config" (OuterVolumeSpecName: "config") pod "3cecdb55-b664-4224-bad8-524bf97f879b" (UID: "3cecdb55-b664-4224-bad8-524bf97f879b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.471166 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" event={"ID":"94c52795-0c3b-46fc-9e55-bd8b5c226f1e","Type":"ContainerStarted","Data":"efe5c1ec502fb8601cf9a9fe9a047425721213bb3f7d1e34c9cc83f4d5fac449"} Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.472604 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cecdb55-b664-4224-bad8-524bf97f879b-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.472641 4742 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3cecdb55-b664-4224-bad8-524bf97f879b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.472657 4742 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/434c361d-ee53-4862-86c8-a0eddb1ae902-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.472673 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/434c361d-ee53-4862-86c8-a0eddb1ae902-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.477216 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" event={"ID":"434c361d-ee53-4862-86c8-a0eddb1ae902","Type":"ContainerDied","Data":"58cde58720a1e96ea88081a646ba83b5685a343cc862d98a2131158f0ca39ce4"} Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.477269 4742 scope.go:117] "RemoveContainer" containerID="06786f0c491a80a2ad68941841042fb59d99cf6de74f120ff48d89c5cd8fb767" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.477401 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-4x7wf" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.485541 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" event={"ID":"3a91260a-abb6-4e26-b041-c39b36369405","Type":"ContainerStarted","Data":"25dd5612e6aae3591d98a70b3b5612c98bedc9989e5b0e013aa3e331619ea2ee"} Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.487492 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pmxjd" event={"ID":"0a50ef5e-ba73-4d00-baba-b8ef6c621d71","Type":"ContainerStarted","Data":"11a17efcba4276b05191022a7fbf7b1cd41484559b6b09e90b65bc96dfa86248"} Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.506924 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.507613 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-l84hg" event={"ID":"3cecdb55-b664-4224-bad8-524bf97f879b","Type":"ContainerDied","Data":"1b8b25ae34c93ed5025802661461f0e326549868c8b8a51fc69681aad26e4e5e"} Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.556518 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.610118 4742 scope.go:117] "RemoveContainer" containerID="abe59923ff989f9d02259c6fd8407148ec0e31e60885828c313d370b8ae8a073" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.731035 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-4x7wf"] Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.747348 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-4x7wf"] Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.772799 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l84hg"] Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.778068 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-l84hg"] Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.782367 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 17 11:31:25 crc kubenswrapper[4742]: E0317 11:31:25.783474 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="434c361d-ee53-4862-86c8-a0eddb1ae902" containerName="init" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.783495 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="434c361d-ee53-4862-86c8-a0eddb1ae902" containerName="init" Mar 17 11:31:25 crc kubenswrapper[4742]: E0317 11:31:25.783526 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cecdb55-b664-4224-bad8-524bf97f879b" containerName="init" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.783533 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cecdb55-b664-4224-bad8-524bf97f879b" containerName="init" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.783701 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cecdb55-b664-4224-bad8-524bf97f879b" containerName="init" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.783722 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="434c361d-ee53-4862-86c8-a0eddb1ae902" containerName="init" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.784457 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.786349 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-2x4j7" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.788725 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.789008 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.789215 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.789251 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.892694 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/56194d57-077f-40f4-87f6-386942ac0f6b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.892753 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/56194d57-077f-40f4-87f6-386942ac0f6b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.892790 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/56194d57-077f-40f4-87f6-386942ac0f6b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.892813 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wxx5\" (UniqueName: \"kubernetes.io/projected/56194d57-077f-40f4-87f6-386942ac0f6b-kube-api-access-4wxx5\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.892840 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56194d57-077f-40f4-87f6-386942ac0f6b-scripts\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.892877 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56194d57-077f-40f4-87f6-386942ac0f6b-config\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.893110 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56194d57-077f-40f4-87f6-386942ac0f6b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: 
I0317 11:31:25.994301 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/56194d57-077f-40f4-87f6-386942ac0f6b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.994595 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/56194d57-077f-40f4-87f6-386942ac0f6b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.994632 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/56194d57-077f-40f4-87f6-386942ac0f6b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.994651 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wxx5\" (UniqueName: \"kubernetes.io/projected/56194d57-077f-40f4-87f6-386942ac0f6b-kube-api-access-4wxx5\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.994676 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56194d57-077f-40f4-87f6-386942ac0f6b-scripts\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.994714 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56194d57-077f-40f4-87f6-386942ac0f6b-config\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.994747 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56194d57-077f-40f4-87f6-386942ac0f6b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.995431 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/56194d57-077f-40f4-87f6-386942ac0f6b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.995782 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/56194d57-077f-40f4-87f6-386942ac0f6b-scripts\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.996057 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56194d57-077f-40f4-87f6-386942ac0f6b-config\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.998403 4742 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/56194d57-077f-40f4-87f6-386942ac0f6b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:25 crc kubenswrapper[4742]: I0317 11:31:25.998692 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/56194d57-077f-40f4-87f6-386942ac0f6b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:26 crc kubenswrapper[4742]: I0317 11:31:25.999487 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56194d57-077f-40f4-87f6-386942ac0f6b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:26 crc kubenswrapper[4742]: I0317 11:31:26.009568 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wxx5\" (UniqueName: \"kubernetes.io/projected/56194d57-077f-40f4-87f6-386942ac0f6b-kube-api-access-4wxx5\") pod \"ovn-northd-0\" (UID: \"56194d57-077f-40f4-87f6-386942ac0f6b\") " pod="openstack/ovn-northd-0" Mar 17 11:31:26 crc kubenswrapper[4742]: I0317 11:31:26.112262 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 17 11:31:26 crc kubenswrapper[4742]: I0317 11:31:26.515073 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d71d306-a987-411e-82fe-e18450aa18a2","Type":"ContainerStarted","Data":"ae2be08fc5ec8464794b9d028f78ef7f5e9da6e8e3861cfa52e24654763af4af"} Mar 17 11:31:26 crc kubenswrapper[4742]: I0317 11:31:26.517434 4742 generic.go:334] "Generic (PLEG): container finished" podID="3a91260a-abb6-4e26-b041-c39b36369405" containerID="f9239c89d4287fee5ffddf3435fae18b180b27f99abeacbf6b8fbe5c62625272" exitCode=0 Mar 17 11:31:26 crc kubenswrapper[4742]: I0317 11:31:26.517498 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" event={"ID":"3a91260a-abb6-4e26-b041-c39b36369405","Type":"ContainerDied","Data":"f9239c89d4287fee5ffddf3435fae18b180b27f99abeacbf6b8fbe5c62625272"} Mar 17 11:31:26 crc kubenswrapper[4742]: I0317 11:31:26.518686 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pmxjd" event={"ID":"0a50ef5e-ba73-4d00-baba-b8ef6c621d71","Type":"ContainerStarted","Data":"22462d2758153559aef3fbdb1007bbec3779f57d366447f8360751dd77011e6c"} Mar 17 11:31:26 crc kubenswrapper[4742]: I0317 11:31:26.521303 4742 generic.go:334] "Generic (PLEG): container finished" podID="94c52795-0c3b-46fc-9e55-bd8b5c226f1e" containerID="99925e0d6ac7b4460cbdd28febc8b41fad71fbfb65f597e357c3890cebad4ff1" exitCode=0 Mar 17 11:31:26 crc kubenswrapper[4742]: I0317 11:31:26.521505 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" event={"ID":"94c52795-0c3b-46fc-9e55-bd8b5c226f1e","Type":"ContainerDied","Data":"99925e0d6ac7b4460cbdd28febc8b41fad71fbfb65f597e357c3890cebad4ff1"} Mar 17 11:31:26 crc kubenswrapper[4742]: I0317 11:31:26.524032 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6","Type":"ContainerStarted","Data":"2b56274b6b78ca4e5410d6fa294dba941d61ff2a15e2f2b60bc50b901df2e13d"} Mar 17 11:31:26 crc kubenswrapper[4742]: I0317 11:31:26.580654 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-pmxjd" podStartSLOduration=3.580625997 podStartE2EDuration="3.580625997s" podCreationTimestamp="2026-03-17 11:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:31:26.5607453 +0000 UTC m=+1189.686873068" watchObservedRunningTime="2026-03-17 11:31:26.580625997 +0000 UTC m=+1189.706753775" Mar 17 11:31:26 crc kubenswrapper[4742]: I0317 11:31:26.597761 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 17 11:31:26 crc kubenswrapper[4742]: W0317 11:31:26.643334 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56194d57_077f_40f4_87f6_386942ac0f6b.slice/crio-f6df235c9264018ec2657a7840eeb3528707bad0a6ddb5c9b18442459a5e85e0 WatchSource:0}: Error finding container f6df235c9264018ec2657a7840eeb3528707bad0a6ddb5c9b18442459a5e85e0: Status 404 returned error can't find the container with id f6df235c9264018ec2657a7840eeb3528707bad0a6ddb5c9b18442459a5e85e0 Mar 17 11:31:26 crc kubenswrapper[4742]: I0317 11:31:26.696021 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cecdb55-b664-4224-bad8-524bf97f879b" path="/var/lib/kubelet/pods/3cecdb55-b664-4224-bad8-524bf97f879b/volumes" Mar 17 11:31:26 crc kubenswrapper[4742]: I0317 11:31:26.696677 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="434c361d-ee53-4862-86c8-a0eddb1ae902" path="/var/lib/kubelet/pods/434c361d-ee53-4862-86c8-a0eddb1ae902/volumes" Mar 17 11:31:26 crc kubenswrapper[4742]: I0317 11:31:26.919887 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 17 11:31:27 crc kubenswrapper[4742]: I0317 11:31:27.531470 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"56194d57-077f-40f4-87f6-386942ac0f6b","Type":"ContainerStarted","Data":"f6df235c9264018ec2657a7840eeb3528707bad0a6ddb5c9b18442459a5e85e0"} Mar 17 11:31:27 crc kubenswrapper[4742]: I0317 11:31:27.533967 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" event={"ID":"3a91260a-abb6-4e26-b041-c39b36369405","Type":"ContainerStarted","Data":"b55f0ae81d3c4dbb89d3d082ff9141409db2a32120c416f78cb9b535b2a1e22c"} Mar 17 11:31:27 crc kubenswrapper[4742]: I0317 11:31:27.534133 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:27 crc kubenswrapper[4742]: I0317 11:31:27.537247 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" event={"ID":"94c52795-0c3b-46fc-9e55-bd8b5c226f1e","Type":"ContainerStarted","Data":"d63c5d89829eec034d4de0c53e6e151b9394f9c5d4bef1ab89a477b0ca897de3"} Mar 17 11:31:27 crc kubenswrapper[4742]: I0317 11:31:27.575189 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" podStartSLOduration=4.575167725 podStartE2EDuration="4.575167725s" podCreationTimestamp="2026-03-17 11:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:31:27.57161529 +0000 UTC m=+1190.697743068" watchObservedRunningTime="2026-03-17 11:31:27.575167725 +0000 UTC m=+1190.701295483" Mar 17 11:31:27 crc kubenswrapper[4742]: I0317 11:31:27.578781 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" podStartSLOduration=3.57876944 podStartE2EDuration="3.57876944s" podCreationTimestamp="2026-03-17 11:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:31:27.556039177 +0000 UTC m=+1190.682166955" watchObservedRunningTime="2026-03-17 11:31:27.57876944 +0000 UTC m=+1190.704897198" Mar 17 11:31:28 crc kubenswrapper[4742]: I0317 11:31:28.548760 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"56194d57-077f-40f4-87f6-386942ac0f6b","Type":"ContainerStarted","Data":"c314aeacf9055fcc6803dda05c154dd7b52abc4349eac02290f97d81de3b58ab"} Mar 17 11:31:28 crc kubenswrapper[4742]: I0317 11:31:28.549772 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:28 crc kubenswrapper[4742]: I0317 11:31:28.549802 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"56194d57-077f-40f4-87f6-386942ac0f6b","Type":"ContainerStarted","Data":"03d194d330252c6bff9fdaa1e7d88ad9708e76fc9af46c7078d96ed0203b9ae3"} Mar 17 11:31:28 crc kubenswrapper[4742]: I0317 11:31:28.578131 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.253926419 podStartE2EDuration="3.578105684s" podCreationTimestamp="2026-03-17 11:31:25 +0000 UTC" firstStartedPulling="2026-03-17 11:31:26.648191971 +0000 UTC m=+1189.774319729" lastFinishedPulling="2026-03-17 11:31:27.972371236 +0000 UTC m=+1191.098498994" observedRunningTime="2026-03-17 11:31:28.57230273 +0000 UTC m=+1191.698430528" watchObservedRunningTime="2026-03-17 11:31:28.578105684 +0000 UTC m=+1191.704233452" Mar 17 11:31:29 crc kubenswrapper[4742]: I0317 11:31:29.558918 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 17 11:31:30 crc kubenswrapper[4742]: I0317 11:31:30.375246 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 17 11:31:30 crc kubenswrapper[4742]: I0317 11:31:30.375607 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 17 11:31:30 crc kubenswrapper[4742]: I0317 11:31:30.469718 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 17 11:31:30 crc kubenswrapper[4742]: I0317 11:31:30.633296 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 17 11:31:31 crc kubenswrapper[4742]: I0317 11:31:31.618539 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 17 11:31:31 crc kubenswrapper[4742]: I0317 11:31:31.619333 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 17 11:31:31 crc kubenswrapper[4742]: I0317 11:31:31.716413 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 17 
11:31:32 crc kubenswrapper[4742]: I0317 11:31:32.784019 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 17 11:31:32 crc kubenswrapper[4742]: I0317 11:31:32.892397 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-065f-account-create-update-cxjdl"] Mar 17 11:31:32 crc kubenswrapper[4742]: I0317 11:31:32.893312 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-065f-account-create-update-cxjdl" Mar 17 11:31:32 crc kubenswrapper[4742]: I0317 11:31:32.894864 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 17 11:31:32 crc kubenswrapper[4742]: I0317 11:31:32.914241 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-065f-account-create-update-cxjdl"] Mar 17 11:31:32 crc kubenswrapper[4742]: I0317 11:31:32.922683 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-97pcv"] Mar 17 11:31:32 crc kubenswrapper[4742]: I0317 11:31:32.923743 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-97pcv" Mar 17 11:31:32 crc kubenswrapper[4742]: I0317 11:31:32.944544 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-97pcv"] Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.019416 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19a4cab3-fe96-424d-a768-741b2c01d8e0-operator-scripts\") pod \"keystone-065f-account-create-update-cxjdl\" (UID: \"19a4cab3-fe96-424d-a768-741b2c01d8e0\") " pod="openstack/keystone-065f-account-create-update-cxjdl" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.019483 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xxhn\" (UniqueName: \"kubernetes.io/projected/d80d7e30-5242-48a7-b61b-6e3d74364128-kube-api-access-7xxhn\") pod \"keystone-db-create-97pcv\" (UID: \"d80d7e30-5242-48a7-b61b-6e3d74364128\") " pod="openstack/keystone-db-create-97pcv" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.019522 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d80d7e30-5242-48a7-b61b-6e3d74364128-operator-scripts\") pod \"keystone-db-create-97pcv\" (UID: \"d80d7e30-5242-48a7-b61b-6e3d74364128\") " pod="openstack/keystone-db-create-97pcv" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.019638 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjffj\" (UniqueName: \"kubernetes.io/projected/19a4cab3-fe96-424d-a768-741b2c01d8e0-kube-api-access-sjffj\") pod \"keystone-065f-account-create-update-cxjdl\" (UID: \"19a4cab3-fe96-424d-a768-741b2c01d8e0\") " pod="openstack/keystone-065f-account-create-update-cxjdl" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.054840 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-wp4gd"] Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.055750 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-wp4gd" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.064493 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-wp4gd"] Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.120913 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19a4cab3-fe96-424d-a768-741b2c01d8e0-operator-scripts\") pod \"keystone-065f-account-create-update-cxjdl\" (UID: \"19a4cab3-fe96-424d-a768-741b2c01d8e0\") " pod="openstack/keystone-065f-account-create-update-cxjdl" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.120964 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28db6ae8-7bd6-4c58-9b49-7349349da904-operator-scripts\") pod \"placement-db-create-wp4gd\" (UID: \"28db6ae8-7bd6-4c58-9b49-7349349da904\") " pod="openstack/placement-db-create-wp4gd" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.121002 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xxhn\" (UniqueName: \"kubernetes.io/projected/d80d7e30-5242-48a7-b61b-6e3d74364128-kube-api-access-7xxhn\") pod \"keystone-db-create-97pcv\" (UID: \"d80d7e30-5242-48a7-b61b-6e3d74364128\") " pod="openstack/keystone-db-create-97pcv" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.121031 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d80d7e30-5242-48a7-b61b-6e3d74364128-operator-scripts\") pod \"keystone-db-create-97pcv\" (UID: \"d80d7e30-5242-48a7-b61b-6e3d74364128\") " pod="openstack/keystone-db-create-97pcv" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.121067 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd7gs\" (UniqueName: \"kubernetes.io/projected/28db6ae8-7bd6-4c58-9b49-7349349da904-kube-api-access-wd7gs\") pod \"placement-db-create-wp4gd\" (UID: \"28db6ae8-7bd6-4c58-9b49-7349349da904\") " pod="openstack/placement-db-create-wp4gd" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.121115 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjffj\" (UniqueName: \"kubernetes.io/projected/19a4cab3-fe96-424d-a768-741b2c01d8e0-kube-api-access-sjffj\") pod \"keystone-065f-account-create-update-cxjdl\" (UID: \"19a4cab3-fe96-424d-a768-741b2c01d8e0\") " pod="openstack/keystone-065f-account-create-update-cxjdl" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.121974 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19a4cab3-fe96-424d-a768-741b2c01d8e0-operator-scripts\") pod \"keystone-065f-account-create-update-cxjdl\" (UID: \"19a4cab3-fe96-424d-a768-741b2c01d8e0\") " pod="openstack/keystone-065f-account-create-update-cxjdl" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.122765 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d80d7e30-5242-48a7-b61b-6e3d74364128-operator-scripts\") pod \"keystone-db-create-97pcv\" (UID: \"d80d7e30-5242-48a7-b61b-6e3d74364128\") " pod="openstack/keystone-db-create-97pcv" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.139097 4742 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xxhn\" (UniqueName: \"kubernetes.io/projected/d80d7e30-5242-48a7-b61b-6e3d74364128-kube-api-access-7xxhn\") pod \"keystone-db-create-97pcv\" (UID: \"d80d7e30-5242-48a7-b61b-6e3d74364128\") " pod="openstack/keystone-db-create-97pcv" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.146294 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjffj\" (UniqueName: \"kubernetes.io/projected/19a4cab3-fe96-424d-a768-741b2c01d8e0-kube-api-access-sjffj\") pod \"keystone-065f-account-create-update-cxjdl\" (UID: \"19a4cab3-fe96-424d-a768-741b2c01d8e0\") " pod="openstack/keystone-065f-account-create-update-cxjdl" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.192894 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-18db-account-create-update-psfh5"] Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.195292 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-18db-account-create-update-psfh5" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.198556 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.199470 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-18db-account-create-update-psfh5"] Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.215466 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-065f-account-create-update-cxjdl" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.222175 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28db6ae8-7bd6-4c58-9b49-7349349da904-operator-scripts\") pod \"placement-db-create-wp4gd\" (UID: \"28db6ae8-7bd6-4c58-9b49-7349349da904\") " pod="openstack/placement-db-create-wp4gd" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.222243 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd7gs\" (UniqueName: \"kubernetes.io/projected/28db6ae8-7bd6-4c58-9b49-7349349da904-kube-api-access-wd7gs\") pod \"placement-db-create-wp4gd\" (UID: \"28db6ae8-7bd6-4c58-9b49-7349349da904\") " pod="openstack/placement-db-create-wp4gd" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.223149 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28db6ae8-7bd6-4c58-9b49-7349349da904-operator-scripts\") pod \"placement-db-create-wp4gd\" (UID: \"28db6ae8-7bd6-4c58-9b49-7349349da904\") " pod="openstack/placement-db-create-wp4gd" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.238656 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd7gs\" (UniqueName: \"kubernetes.io/projected/28db6ae8-7bd6-4c58-9b49-7349349da904-kube-api-access-wd7gs\") pod \"placement-db-create-wp4gd\" (UID: \"28db6ae8-7bd6-4c58-9b49-7349349da904\") " pod="openstack/placement-db-create-wp4gd" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.243019 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-97pcv" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.323910 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvgfc\" (UniqueName: \"kubernetes.io/projected/d84251cb-eea2-41f7-b743-ab3a4d0c4ae1-kube-api-access-hvgfc\") pod \"placement-18db-account-create-update-psfh5\" (UID: \"d84251cb-eea2-41f7-b743-ab3a4d0c4ae1\") " pod="openstack/placement-18db-account-create-update-psfh5" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.324072 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d84251cb-eea2-41f7-b743-ab3a4d0c4ae1-operator-scripts\") pod \"placement-18db-account-create-update-psfh5\" (UID: \"d84251cb-eea2-41f7-b743-ab3a4d0c4ae1\") " pod="openstack/placement-18db-account-create-update-psfh5" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.374993 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wp4gd" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.440057 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvgfc\" (UniqueName: \"kubernetes.io/projected/d84251cb-eea2-41f7-b743-ab3a4d0c4ae1-kube-api-access-hvgfc\") pod \"placement-18db-account-create-update-psfh5\" (UID: \"d84251cb-eea2-41f7-b743-ab3a4d0c4ae1\") " pod="openstack/placement-18db-account-create-update-psfh5" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.440252 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d84251cb-eea2-41f7-b743-ab3a4d0c4ae1-operator-scripts\") pod \"placement-18db-account-create-update-psfh5\" (UID: \"d84251cb-eea2-41f7-b743-ab3a4d0c4ae1\") " pod="openstack/placement-18db-account-create-update-psfh5" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.441523 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d84251cb-eea2-41f7-b743-ab3a4d0c4ae1-operator-scripts\") pod \"placement-18db-account-create-update-psfh5\" (UID: \"d84251cb-eea2-41f7-b743-ab3a4d0c4ae1\") " pod="openstack/placement-18db-account-create-update-psfh5" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.461331 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvgfc\" (UniqueName: \"kubernetes.io/projected/d84251cb-eea2-41f7-b743-ab3a4d0c4ae1-kube-api-access-hvgfc\") pod \"placement-18db-account-create-update-psfh5\" (UID: \"d84251cb-eea2-41f7-b743-ab3a4d0c4ae1\") " pod="openstack/placement-18db-account-create-update-psfh5" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.659905 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-18db-account-create-update-psfh5" Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.682992 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-065f-account-create-update-cxjdl"] Mar 17 11:31:33 crc kubenswrapper[4742]: W0317 11:31:33.684223 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19a4cab3_fe96_424d_a768_741b2c01d8e0.slice/crio-958d5d79bdcc5638c3e8869e73bcfedf57776b7a8c6a31192259f2710fd88e9b WatchSource:0}: Error finding container 958d5d79bdcc5638c3e8869e73bcfedf57776b7a8c6a31192259f2710fd88e9b: Status 404 returned error can't find the container with id 958d5d79bdcc5638c3e8869e73bcfedf57776b7a8c6a31192259f2710fd88e9b Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.763858 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-97pcv"] Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.856795 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-wp4gd"] Mar 17 11:31:33 crc kubenswrapper[4742]: W0317 11:31:33.863460 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28db6ae8_7bd6_4c58_9b49_7349349da904.slice/crio-e4147851e1e4a9498f260931e779529e0b4a005e62d60b1eca73846a8492d452 WatchSource:0}: Error finding container e4147851e1e4a9498f260931e779529e0b4a005e62d60b1eca73846a8492d452: Status 404 returned error can't find the container with id e4147851e1e4a9498f260931e779529e0b4a005e62d60b1eca73846a8492d452 Mar 17 11:31:33 crc kubenswrapper[4742]: I0317 11:31:33.903716 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-18db-account-create-update-psfh5"] Mar 17 11:31:33 crc kubenswrapper[4742]: W0317 11:31:33.909327 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd84251cb_eea2_41f7_b743_ab3a4d0c4ae1.slice/crio-10b33dd47956d8dfe194968c039e179d5f8faae604035f0390d90ca1a9232d17 WatchSource:0}: Error finding container 10b33dd47956d8dfe194968c039e179d5f8faae604035f0390d90ca1a9232d17: Status 404 returned error can't find the container with id 10b33dd47956d8dfe194968c039e179d5f8faae604035f0390d90ca1a9232d17 Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.125509 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vpgmb"] Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.126180 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" podUID="94c52795-0c3b-46fc-9e55-bd8b5c226f1e" containerName="dnsmasq-dns" containerID="cri-o://d63c5d89829eec034d4de0c53e6e151b9394f9c5d4bef1ab89a477b0ca897de3" gracePeriod=10 Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.140830 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.187602 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-ldqhg"] Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.188874 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.212741 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-ldqhg"] Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.260754 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-config\") pod \"dnsmasq-dns-698758b865-ldqhg\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") " pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.261155 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-ldqhg\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") " pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.261198 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-dns-svc\") pod \"dnsmasq-dns-698758b865-ldqhg\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") " pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.261217 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bftxm\" (UniqueName: \"kubernetes.io/projected/d5b7d712-b6f0-43e1-a95b-e49251608407-kube-api-access-bftxm\") pod \"dnsmasq-dns-698758b865-ldqhg\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") " pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.261234 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-ldqhg\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") " pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.363207 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-ldqhg\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") " pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.363478 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-dns-svc\") pod \"dnsmasq-dns-698758b865-ldqhg\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") " pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.363581 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bftxm\" (UniqueName: \"kubernetes.io/projected/d5b7d712-b6f0-43e1-a95b-e49251608407-kube-api-access-bftxm\") pod \"dnsmasq-dns-698758b865-ldqhg\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") " pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.363671 4742 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-ldqhg\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") " pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.363812 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-config\") pod \"dnsmasq-dns-698758b865-ldqhg\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") " pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.365008 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-dns-svc\") pod \"dnsmasq-dns-698758b865-ldqhg\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") " pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.366297 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-config\") pod \"dnsmasq-dns-698758b865-ldqhg\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") " pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.366463 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-ldqhg\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") " pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.367819 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-ldqhg\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") " pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.394084 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bftxm\" (UniqueName: \"kubernetes.io/projected/d5b7d712-b6f0-43e1-a95b-e49251608407-kube-api-access-bftxm\") pod \"dnsmasq-dns-698758b865-ldqhg\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") " pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.607943 4742 generic.go:334] "Generic (PLEG): container finished" podID="d80d7e30-5242-48a7-b61b-6e3d74364128" containerID="e29ae2e2808df9beb7293f4ecf1cda6fd49e1a8e0254b2fdfa6cae19752cba69" exitCode=0 Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.609018 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-97pcv" event={"ID":"d80d7e30-5242-48a7-b61b-6e3d74364128","Type":"ContainerDied","Data":"e29ae2e2808df9beb7293f4ecf1cda6fd49e1a8e0254b2fdfa6cae19752cba69"} Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.609041 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-97pcv" event={"ID":"d80d7e30-5242-48a7-b61b-6e3d74364128","Type":"ContainerStarted","Data":"f6861f97afe01b21104adfbd2239a278b42907a72cfe3aed757e085d60d02a4f"} Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.612694 4742 generic.go:334] "Generic (PLEG): container finished" 
podID="d84251cb-eea2-41f7-b743-ab3a4d0c4ae1" containerID="dc053eca8afdef59c2d596b1cd594e09c250119abb63e9f7c1a1de6724d9bac1" exitCode=0 Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.612761 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-18db-account-create-update-psfh5" event={"ID":"d84251cb-eea2-41f7-b743-ab3a4d0c4ae1","Type":"ContainerDied","Data":"dc053eca8afdef59c2d596b1cd594e09c250119abb63e9f7c1a1de6724d9bac1"} Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.612788 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-18db-account-create-update-psfh5" event={"ID":"d84251cb-eea2-41f7-b743-ab3a4d0c4ae1","Type":"ContainerStarted","Data":"10b33dd47956d8dfe194968c039e179d5f8faae604035f0390d90ca1a9232d17"} Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.614279 4742 generic.go:334] "Generic (PLEG): container finished" podID="19a4cab3-fe96-424d-a768-741b2c01d8e0" containerID="a9f80e4999e79f490ef91c75e4f6c00600be5edfe0a5f2a06f1503c01a02c9de" exitCode=0 Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.614363 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-065f-account-create-update-cxjdl" event={"ID":"19a4cab3-fe96-424d-a768-741b2c01d8e0","Type":"ContainerDied","Data":"a9f80e4999e79f490ef91c75e4f6c00600be5edfe0a5f2a06f1503c01a02c9de"} Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.614382 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-065f-account-create-update-cxjdl" event={"ID":"19a4cab3-fe96-424d-a768-741b2c01d8e0","Type":"ContainerStarted","Data":"958d5d79bdcc5638c3e8869e73bcfedf57776b7a8c6a31192259f2710fd88e9b"} Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.615794 4742 generic.go:334] "Generic (PLEG): container finished" podID="94c52795-0c3b-46fc-9e55-bd8b5c226f1e" containerID="d63c5d89829eec034d4de0c53e6e151b9394f9c5d4bef1ab89a477b0ca897de3" exitCode=0 Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.615874 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" event={"ID":"94c52795-0c3b-46fc-9e55-bd8b5c226f1e","Type":"ContainerDied","Data":"d63c5d89829eec034d4de0c53e6e151b9394f9c5d4bef1ab89a477b0ca897de3"} Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.617165 4742 generic.go:334] "Generic (PLEG): container finished" podID="28db6ae8-7bd6-4c58-9b49-7349349da904" containerID="5b537e8a453a925bd038aa6d28e37d38d71cac184d10382ddefd9f7537a455a0" exitCode=0 Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.617193 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wp4gd" event={"ID":"28db6ae8-7bd6-4c58-9b49-7349349da904","Type":"ContainerDied","Data":"5b537e8a453a925bd038aa6d28e37d38d71cac184d10382ddefd9f7537a455a0"} Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.617207 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wp4gd" event={"ID":"28db6ae8-7bd6-4c58-9b49-7349349da904","Type":"ContainerStarted","Data":"e4147851e1e4a9498f260931e779529e0b4a005e62d60b1eca73846a8492d452"} Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.640572 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.688167 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:34 crc kubenswrapper[4742]: E0317 11:31:34.703133 4742 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28db6ae8_7bd6_4c58_9b49_7349349da904.slice/crio-conmon-5b537e8a453a925bd038aa6d28e37d38d71cac184d10382ddefd9f7537a455a0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd84251cb_eea2_41f7_b743_ab3a4d0c4ae1.slice/crio-dc053eca8afdef59c2d596b1cd594e09c250119abb63e9f7c1a1de6724d9bac1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28db6ae8_7bd6_4c58_9b49_7349349da904.slice/crio-5b537e8a453a925bd038aa6d28e37d38d71cac184d10382ddefd9f7537a455a0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd84251cb_eea2_41f7_b743_ab3a4d0c4ae1.slice/crio-conmon-dc053eca8afdef59c2d596b1cd594e09c250119abb63e9f7c1a1de6724d9bac1.scope\": RecentStats: unable to find data in memory cache]" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.729198 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.771774 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c878n\" (UniqueName: \"kubernetes.io/projected/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-kube-api-access-c878n\") pod \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\" (UID: \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\") " Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.771808 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-config\") pod \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\" (UID: \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\") " Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.772000 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-ovsdbserver-sb\") pod \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\" (UID: \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\") " Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.772045 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-dns-svc\") pod \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\" (UID: \"94c52795-0c3b-46fc-9e55-bd8b5c226f1e\") " Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.779824 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-kube-api-access-c878n" (OuterVolumeSpecName: "kube-api-access-c878n") pod "94c52795-0c3b-46fc-9e55-bd8b5c226f1e" (UID: "94c52795-0c3b-46fc-9e55-bd8b5c226f1e"). InnerVolumeSpecName "kube-api-access-c878n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.818654 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "94c52795-0c3b-46fc-9e55-bd8b5c226f1e" (UID: "94c52795-0c3b-46fc-9e55-bd8b5c226f1e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.832107 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "94c52795-0c3b-46fc-9e55-bd8b5c226f1e" (UID: "94c52795-0c3b-46fc-9e55-bd8b5c226f1e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.848268 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-config" (OuterVolumeSpecName: "config") pod "94c52795-0c3b-46fc-9e55-bd8b5c226f1e" (UID: "94c52795-0c3b-46fc-9e55-bd8b5c226f1e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.880962 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.880993 4742 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.881003 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c878n\" (UniqueName: \"kubernetes.io/projected/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-kube-api-access-c878n\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:34 crc kubenswrapper[4742]: I0317 11:31:34.881014 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94c52795-0c3b-46fc-9e55-bd8b5c226f1e-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.189650 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-ldqhg"] Mar 17 11:31:35 crc kubenswrapper[4742]: W0317 11:31:35.192629 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5b7d712_b6f0_43e1_a95b_e49251608407.slice/crio-679b8cc74a75413f4ed61f065f1a70f325cc700c29f59c2bc94c38e560c74dea WatchSource:0}: Error finding container 679b8cc74a75413f4ed61f065f1a70f325cc700c29f59c2bc94c38e560c74dea: Status 404 returned error can't find the container with id 679b8cc74a75413f4ed61f065f1a70f325cc700c29f59c2bc94c38e560c74dea Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.275833 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 17 11:31:35 crc kubenswrapper[4742]: E0317 11:31:35.276157 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c52795-0c3b-46fc-9e55-bd8b5c226f1e" containerName="init" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.276175 4742 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="94c52795-0c3b-46fc-9e55-bd8b5c226f1e" containerName="init" Mar 17 11:31:35 crc kubenswrapper[4742]: E0317 11:31:35.276207 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c52795-0c3b-46fc-9e55-bd8b5c226f1e" containerName="dnsmasq-dns" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.276214 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c52795-0c3b-46fc-9e55-bd8b5c226f1e" containerName="dnsmasq-dns" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.276391 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="94c52795-0c3b-46fc-9e55-bd8b5c226f1e" containerName="dnsmasq-dns" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.280454 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.282373 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.282576 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.282711 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.287823 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-kdr7s" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.307647 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.387727 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be22c821-2e25-47ed-938d-c748fc55a4c6-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.388049 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/be22c821-2e25-47ed-938d-c748fc55a4c6-lock\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.388066 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.388088 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.388169 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/be22c821-2e25-47ed-938d-c748fc55a4c6-cache\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.388358 4742 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5lg9\" (UniqueName: \"kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-kube-api-access-q5lg9\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.490430 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/be22c821-2e25-47ed-938d-c748fc55a4c6-cache\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.490479 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5lg9\" (UniqueName: \"kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-kube-api-access-q5lg9\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.490533 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be22c821-2e25-47ed-938d-c748fc55a4c6-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.490578 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/be22c821-2e25-47ed-938d-c748fc55a4c6-lock\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.490602 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.490629 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.490952 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.491691 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/be22c821-2e25-47ed-938d-c748fc55a4c6-cache\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.491983 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/be22c821-2e25-47ed-938d-c748fc55a4c6-lock\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: E0317 
11:31:35.492082 4742 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 17 11:31:35 crc kubenswrapper[4742]: E0317 11:31:35.492105 4742 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 17 11:31:35 crc kubenswrapper[4742]: E0317 11:31:35.492149 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift podName:be22c821-2e25-47ed-938d-c748fc55a4c6 nodeName:}" failed. No retries permitted until 2026-03-17 11:31:35.992132281 +0000 UTC m=+1199.118260049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift") pod "swift-storage-0" (UID: "be22c821-2e25-47ed-938d-c748fc55a4c6") : configmap "swift-ring-files" not found Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.497945 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be22c821-2e25-47ed-938d-c748fc55a4c6-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.520577 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.520814 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5lg9\" (UniqueName: \"kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-kube-api-access-q5lg9\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.633788 4742 generic.go:334] "Generic (PLEG): container finished" podID="d5b7d712-b6f0-43e1-a95b-e49251608407" containerID="19c851e05f0711c436c16a16a9aae644a7b365716b32c838639893b99dc976a6" exitCode=0 Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.633864 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-ldqhg" event={"ID":"d5b7d712-b6f0-43e1-a95b-e49251608407","Type":"ContainerDied","Data":"19c851e05f0711c436c16a16a9aae644a7b365716b32c838639893b99dc976a6"} Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.633895 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-ldqhg" event={"ID":"d5b7d712-b6f0-43e1-a95b-e49251608407","Type":"ContainerStarted","Data":"679b8cc74a75413f4ed61f065f1a70f325cc700c29f59c2bc94c38e560c74dea"} Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.647419 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.648554 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-vpgmb" event={"ID":"94c52795-0c3b-46fc-9e55-bd8b5c226f1e","Type":"ContainerDied","Data":"efe5c1ec502fb8601cf9a9fe9a047425721213bb3f7d1e34c9cc83f4d5fac449"} Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.648688 4742 scope.go:117] "RemoveContainer" containerID="d63c5d89829eec034d4de0c53e6e151b9394f9c5d4bef1ab89a477b0ca897de3" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.694723 4742 scope.go:117] "RemoveContainer" containerID="99925e0d6ac7b4460cbdd28febc8b41fad71fbfb65f597e357c3890cebad4ff1" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.705529 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vpgmb"] Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.714775 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vpgmb"] Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.808734 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-rrnw9"] Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.810102 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.814000 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.814232 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.814418 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.829725 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rrnw9"] Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.902509 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-etc-swift\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.902585 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-swiftconf\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.902631 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx4ql\" (UniqueName: \"kubernetes.io/projected/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-kube-api-access-fx4ql\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.902655 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-combined-ca-bundle\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.902675 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-ring-data-devices\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.902705 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-dispersionconf\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.902733 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-scripts\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:35 crc kubenswrapper[4742]: I0317 11:31:35.998117 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-065f-account-create-update-cxjdl" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.003839 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-swiftconf\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.004121 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.004271 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx4ql\" (UniqueName: \"kubernetes.io/projected/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-kube-api-access-fx4ql\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.004385 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-combined-ca-bundle\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:36 crc kubenswrapper[4742]: E0317 11:31:36.004492 4742 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 17 11:31:36 crc kubenswrapper[4742]: E0317 11:31:36.004527 4742 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 17 11:31:36 crc 
kubenswrapper[4742]: E0317 11:31:36.004586 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift podName:be22c821-2e25-47ed-938d-c748fc55a4c6 nodeName:}" failed. No retries permitted until 2026-03-17 11:31:37.004564482 +0000 UTC m=+1200.130692330 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift") pod "swift-storage-0" (UID: "be22c821-2e25-47ed-938d-c748fc55a4c6") : configmap "swift-ring-files" not found Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.004495 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-dispersionconf\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.004644 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-ring-data-devices\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.004728 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-scripts\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.004872 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-etc-swift\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.006192 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-scripts\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.006193 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-ring-data-devices\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.006400 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-etc-swift\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.011475 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-dispersionconf\") pod \"swift-ring-rebalance-rrnw9\" (UID: 
\"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.012146 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-combined-ca-bundle\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.017163 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-swiftconf\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.027719 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx4ql\" (UniqueName: \"kubernetes.io/projected/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-kube-api-access-fx4ql\") pod \"swift-ring-rebalance-rrnw9\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.094214 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wp4gd" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.106349 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjffj\" (UniqueName: \"kubernetes.io/projected/19a4cab3-fe96-424d-a768-741b2c01d8e0-kube-api-access-sjffj\") pod \"19a4cab3-fe96-424d-a768-741b2c01d8e0\" (UID: \"19a4cab3-fe96-424d-a768-741b2c01d8e0\") " Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.106436 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19a4cab3-fe96-424d-a768-741b2c01d8e0-operator-scripts\") pod \"19a4cab3-fe96-424d-a768-741b2c01d8e0\" (UID: \"19a4cab3-fe96-424d-a768-741b2c01d8e0\") " Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.107000 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19a4cab3-fe96-424d-a768-741b2c01d8e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "19a4cab3-fe96-424d-a768-741b2c01d8e0" (UID: "19a4cab3-fe96-424d-a768-741b2c01d8e0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.154502 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.157148 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19a4cab3-fe96-424d-a768-741b2c01d8e0-kube-api-access-sjffj" (OuterVolumeSpecName: "kube-api-access-sjffj") pod "19a4cab3-fe96-424d-a768-741b2c01d8e0" (UID: "19a4cab3-fe96-424d-a768-741b2c01d8e0"). InnerVolumeSpecName "kube-api-access-sjffj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.158664 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-97pcv" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.208218 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28db6ae8-7bd6-4c58-9b49-7349349da904-operator-scripts\") pod \"28db6ae8-7bd6-4c58-9b49-7349349da904\" (UID: \"28db6ae8-7bd6-4c58-9b49-7349349da904\") " Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.208320 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd7gs\" (UniqueName: \"kubernetes.io/projected/28db6ae8-7bd6-4c58-9b49-7349349da904-kube-api-access-wd7gs\") pod \"28db6ae8-7bd6-4c58-9b49-7349349da904\" (UID: \"28db6ae8-7bd6-4c58-9b49-7349349da904\") " Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.208362 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xxhn\" (UniqueName: \"kubernetes.io/projected/d80d7e30-5242-48a7-b61b-6e3d74364128-kube-api-access-7xxhn\") pod \"d80d7e30-5242-48a7-b61b-6e3d74364128\" (UID: \"d80d7e30-5242-48a7-b61b-6e3d74364128\") " Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.208466 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d80d7e30-5242-48a7-b61b-6e3d74364128-operator-scripts\") pod \"d80d7e30-5242-48a7-b61b-6e3d74364128\" (UID: \"d80d7e30-5242-48a7-b61b-6e3d74364128\") " Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.208828 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19a4cab3-fe96-424d-a768-741b2c01d8e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.208846 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjffj\" (UniqueName: \"kubernetes.io/projected/19a4cab3-fe96-424d-a768-741b2c01d8e0-kube-api-access-sjffj\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.209303 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d80d7e30-5242-48a7-b61b-6e3d74364128-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d80d7e30-5242-48a7-b61b-6e3d74364128" (UID: "d80d7e30-5242-48a7-b61b-6e3d74364128"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.209443 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28db6ae8-7bd6-4c58-9b49-7349349da904-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "28db6ae8-7bd6-4c58-9b49-7349349da904" (UID: "28db6ae8-7bd6-4c58-9b49-7349349da904"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.212196 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d80d7e30-5242-48a7-b61b-6e3d74364128-kube-api-access-7xxhn" (OuterVolumeSpecName: "kube-api-access-7xxhn") pod "d80d7e30-5242-48a7-b61b-6e3d74364128" (UID: "d80d7e30-5242-48a7-b61b-6e3d74364128"). InnerVolumeSpecName "kube-api-access-7xxhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.215107 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28db6ae8-7bd6-4c58-9b49-7349349da904-kube-api-access-wd7gs" (OuterVolumeSpecName: "kube-api-access-wd7gs") pod "28db6ae8-7bd6-4c58-9b49-7349349da904" (UID: "28db6ae8-7bd6-4c58-9b49-7349349da904"). InnerVolumeSpecName "kube-api-access-wd7gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.288931 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-18db-account-create-update-psfh5" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.309885 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvgfc\" (UniqueName: \"kubernetes.io/projected/d84251cb-eea2-41f7-b743-ab3a4d0c4ae1-kube-api-access-hvgfc\") pod \"d84251cb-eea2-41f7-b743-ab3a4d0c4ae1\" (UID: \"d84251cb-eea2-41f7-b743-ab3a4d0c4ae1\") " Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.310111 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d84251cb-eea2-41f7-b743-ab3a4d0c4ae1-operator-scripts\") pod \"d84251cb-eea2-41f7-b743-ab3a4d0c4ae1\" (UID: \"d84251cb-eea2-41f7-b743-ab3a4d0c4ae1\") " Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.310563 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d80d7e30-5242-48a7-b61b-6e3d74364128-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.310578 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/28db6ae8-7bd6-4c58-9b49-7349349da904-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.310588 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd7gs\" (UniqueName: \"kubernetes.io/projected/28db6ae8-7bd6-4c58-9b49-7349349da904-kube-api-access-wd7gs\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.310596 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xxhn\" (UniqueName: \"kubernetes.io/projected/d80d7e30-5242-48a7-b61b-6e3d74364128-kube-api-access-7xxhn\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.310812 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d84251cb-eea2-41f7-b743-ab3a4d0c4ae1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d84251cb-eea2-41f7-b743-ab3a4d0c4ae1" (UID: "d84251cb-eea2-41f7-b743-ab3a4d0c4ae1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.323189 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d84251cb-eea2-41f7-b743-ab3a4d0c4ae1-kube-api-access-hvgfc" (OuterVolumeSpecName: "kube-api-access-hvgfc") pod "d84251cb-eea2-41f7-b743-ab3a4d0c4ae1" (UID: "d84251cb-eea2-41f7-b743-ab3a4d0c4ae1"). InnerVolumeSpecName "kube-api-access-hvgfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.412466 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvgfc\" (UniqueName: \"kubernetes.io/projected/d84251cb-eea2-41f7-b743-ab3a4d0c4ae1-kube-api-access-hvgfc\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.412510 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d84251cb-eea2-41f7-b743-ab3a4d0c4ae1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.613057 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rrnw9"] Mar 17 11:31:36 crc kubenswrapper[4742]: W0317 11:31:36.624072 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cc5195f_ecc0_4f8e_bc53_ea602fff501d.slice/crio-8ec4a550266ce3278a1377fb8af8b462d79a152e45f3fb9ae70c819ea211977b WatchSource:0}: Error finding container 8ec4a550266ce3278a1377fb8af8b462d79a152e45f3fb9ae70c819ea211977b: Status 404 returned error can't find the container with id 8ec4a550266ce3278a1377fb8af8b462d79a152e45f3fb9ae70c819ea211977b Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.656276 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-ldqhg" event={"ID":"d5b7d712-b6f0-43e1-a95b-e49251608407","Type":"ContainerStarted","Data":"132e6829f0471b35d024d8b51add4272475ab6913eef83c75aa27597272f3deb"} Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.656649 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.657825 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-065f-account-create-update-cxjdl" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.657822 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-065f-account-create-update-cxjdl" event={"ID":"19a4cab3-fe96-424d-a768-741b2c01d8e0","Type":"ContainerDied","Data":"958d5d79bdcc5638c3e8869e73bcfedf57776b7a8c6a31192259f2710fd88e9b"} Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.657950 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="958d5d79bdcc5638c3e8869e73bcfedf57776b7a8c6a31192259f2710fd88e9b" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.661358 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wp4gd" event={"ID":"28db6ae8-7bd6-4c58-9b49-7349349da904","Type":"ContainerDied","Data":"e4147851e1e4a9498f260931e779529e0b4a005e62d60b1eca73846a8492d452"} Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.661384 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4147851e1e4a9498f260931e779529e0b4a005e62d60b1eca73846a8492d452" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.661429 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wp4gd" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.672664 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-97pcv" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.680351 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-18db-account-create-update-psfh5" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.683115 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94c52795-0c3b-46fc-9e55-bd8b5c226f1e" path="/var/lib/kubelet/pods/94c52795-0c3b-46fc-9e55-bd8b5c226f1e/volumes" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.684233 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-ldqhg" podStartSLOduration=2.684219011 podStartE2EDuration="2.684219011s" podCreationTimestamp="2026-03-17 11:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:31:36.678693234 +0000 UTC m=+1199.804820992" watchObservedRunningTime="2026-03-17 11:31:36.684219011 +0000 UTC m=+1199.810346769" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.684683 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-97pcv" event={"ID":"d80d7e30-5242-48a7-b61b-6e3d74364128","Type":"ContainerDied","Data":"f6861f97afe01b21104adfbd2239a278b42907a72cfe3aed757e085d60d02a4f"} Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.684728 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6861f97afe01b21104adfbd2239a278b42907a72cfe3aed757e085d60d02a4f" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.684747 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-18db-account-create-update-psfh5" event={"ID":"d84251cb-eea2-41f7-b743-ab3a4d0c4ae1","Type":"ContainerDied","Data":"10b33dd47956d8dfe194968c039e179d5f8faae604035f0390d90ca1a9232d17"} Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.684764 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10b33dd47956d8dfe194968c039e179d5f8faae604035f0390d90ca1a9232d17" Mar 17 11:31:36 crc kubenswrapper[4742]: I0317 11:31:36.685547 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rrnw9" event={"ID":"3cc5195f-ecc0-4f8e-bc53-ea602fff501d","Type":"ContainerStarted","Data":"8ec4a550266ce3278a1377fb8af8b462d79a152e45f3fb9ae70c819ea211977b"} Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.021554 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:37 crc kubenswrapper[4742]: E0317 11:31:37.021820 4742 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 17 11:31:37 crc kubenswrapper[4742]: E0317 11:31:37.021844 4742 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 17 11:31:37 crc kubenswrapper[4742]: E0317 11:31:37.021966 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift podName:be22c821-2e25-47ed-938d-c748fc55a4c6 nodeName:}" failed. 
No retries permitted until 2026-03-17 11:31:39.021893284 +0000 UTC m=+1202.148021052 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift") pod "swift-storage-0" (UID: "be22c821-2e25-47ed-938d-c748fc55a4c6") : configmap "swift-ring-files" not found Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.065676 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-cqlsx"] Mar 17 11:31:37 crc kubenswrapper[4742]: E0317 11:31:37.066348 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d84251cb-eea2-41f7-b743-ab3a4d0c4ae1" containerName="mariadb-account-create-update" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.066394 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d84251cb-eea2-41f7-b743-ab3a4d0c4ae1" containerName="mariadb-account-create-update" Mar 17 11:31:37 crc kubenswrapper[4742]: E0317 11:31:37.066446 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28db6ae8-7bd6-4c58-9b49-7349349da904" containerName="mariadb-database-create" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.066465 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="28db6ae8-7bd6-4c58-9b49-7349349da904" containerName="mariadb-database-create" Mar 17 11:31:37 crc kubenswrapper[4742]: E0317 11:31:37.066514 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d80d7e30-5242-48a7-b61b-6e3d74364128" containerName="mariadb-database-create" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.066530 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d80d7e30-5242-48a7-b61b-6e3d74364128" containerName="mariadb-database-create" Mar 17 11:31:37 crc kubenswrapper[4742]: E0317 11:31:37.066597 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19a4cab3-fe96-424d-a768-741b2c01d8e0" containerName="mariadb-account-create-update" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.066617 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="19a4cab3-fe96-424d-a768-741b2c01d8e0" containerName="mariadb-account-create-update" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.067528 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="19a4cab3-fe96-424d-a768-741b2c01d8e0" containerName="mariadb-account-create-update" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.067586 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d80d7e30-5242-48a7-b61b-6e3d74364128" containerName="mariadb-database-create" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.067611 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d84251cb-eea2-41f7-b743-ab3a4d0c4ae1" containerName="mariadb-account-create-update" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.067636 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="28db6ae8-7bd6-4c58-9b49-7349349da904" containerName="mariadb-database-create" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.068535 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-cqlsx" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.074336 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-cqlsx"] Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.124318 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9lmf\" (UniqueName: \"kubernetes.io/projected/86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f-kube-api-access-r9lmf\") pod \"glance-db-create-cqlsx\" (UID: \"86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f\") " pod="openstack/glance-db-create-cqlsx" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.124524 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f-operator-scripts\") pod \"glance-db-create-cqlsx\" (UID: \"86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f\") " pod="openstack/glance-db-create-cqlsx" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.175155 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8f69-account-create-update-pwhwq"] Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.176214 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8f69-account-create-update-pwhwq" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.183428 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.197397 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8f69-account-create-update-pwhwq"] Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.253423 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4frc\" (UniqueName: \"kubernetes.io/projected/308384f8-4874-467d-92e9-d5078d3017b5-kube-api-access-s4frc\") pod \"glance-8f69-account-create-update-pwhwq\" (UID: \"308384f8-4874-467d-92e9-d5078d3017b5\") " pod="openstack/glance-8f69-account-create-update-pwhwq" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.253593 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9lmf\" (UniqueName: \"kubernetes.io/projected/86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f-kube-api-access-r9lmf\") pod \"glance-db-create-cqlsx\" (UID: \"86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f\") " pod="openstack/glance-db-create-cqlsx" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.253629 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f-operator-scripts\") pod \"glance-db-create-cqlsx\" (UID: \"86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f\") " pod="openstack/glance-db-create-cqlsx" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.253678 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/308384f8-4874-467d-92e9-d5078d3017b5-operator-scripts\") pod \"glance-8f69-account-create-update-pwhwq\" (UID: \"308384f8-4874-467d-92e9-d5078d3017b5\") " pod="openstack/glance-8f69-account-create-update-pwhwq" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.255360 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f-operator-scripts\") pod \"glance-db-create-cqlsx\" (UID: \"86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f\") " pod="openstack/glance-db-create-cqlsx" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.275471 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9lmf\" (UniqueName: \"kubernetes.io/projected/86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f-kube-api-access-r9lmf\") pod \"glance-db-create-cqlsx\" (UID: \"86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f\") " pod="openstack/glance-db-create-cqlsx" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.354947 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/308384f8-4874-467d-92e9-d5078d3017b5-operator-scripts\") pod \"glance-8f69-account-create-update-pwhwq\" (UID: \"308384f8-4874-467d-92e9-d5078d3017b5\") " pod="openstack/glance-8f69-account-create-update-pwhwq" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.355023 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4frc\" (UniqueName: \"kubernetes.io/projected/308384f8-4874-467d-92e9-d5078d3017b5-kube-api-access-s4frc\") pod \"glance-8f69-account-create-update-pwhwq\" (UID: \"308384f8-4874-467d-92e9-d5078d3017b5\") " pod="openstack/glance-8f69-account-create-update-pwhwq" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.355641 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/308384f8-4874-467d-92e9-d5078d3017b5-operator-scripts\") pod \"glance-8f69-account-create-update-pwhwq\" (UID: \"308384f8-4874-467d-92e9-d5078d3017b5\") " pod="openstack/glance-8f69-account-create-update-pwhwq" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.369874 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4frc\" (UniqueName: \"kubernetes.io/projected/308384f8-4874-467d-92e9-d5078d3017b5-kube-api-access-s4frc\") pod \"glance-8f69-account-create-update-pwhwq\" (UID: \"308384f8-4874-467d-92e9-d5078d3017b5\") " pod="openstack/glance-8f69-account-create-update-pwhwq" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.388766 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-cqlsx" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.582304 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-8f69-account-create-update-pwhwq" Mar 17 11:31:37 crc kubenswrapper[4742]: I0317 11:31:37.845517 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-cqlsx"] Mar 17 11:31:37 crc kubenswrapper[4742]: W0317 11:31:37.853029 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86c59a5d_73d4_45e9_bb2f_cbf9fa687b6f.slice/crio-ba5ce8ad369caaf644b6ea51cd74b2b1c9a1a35a3781692e7183995cada473e7 WatchSource:0}: Error finding container ba5ce8ad369caaf644b6ea51cd74b2b1c9a1a35a3781692e7183995cada473e7: Status 404 returned error can't find the container with id ba5ce8ad369caaf644b6ea51cd74b2b1c9a1a35a3781692e7183995cada473e7 Mar 17 11:31:38 crc kubenswrapper[4742]: I0317 11:31:38.025467 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8f69-account-create-update-pwhwq"] Mar 17 11:31:38 crc kubenswrapper[4742]: W0317 11:31:38.032568 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod308384f8_4874_467d_92e9_d5078d3017b5.slice/crio-c2ee54ec27a500e02412eb2f954f10f32dea20943ca21748de49c7ffdb737f57 WatchSource:0}: Error finding container c2ee54ec27a500e02412eb2f954f10f32dea20943ca21748de49c7ffdb737f57: Status 404 returned error can't find the container with id c2ee54ec27a500e02412eb2f954f10f32dea20943ca21748de49c7ffdb737f57 Mar 17 11:31:38 crc kubenswrapper[4742]: I0317 11:31:38.710321 4742 generic.go:334] "Generic (PLEG): container finished" podID="86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f" containerID="d3abbd19ee12b5bda3502280acd949d13d0a02256a13414347d7f4740c40d154" exitCode=0 Mar 17 11:31:38 crc kubenswrapper[4742]: I0317 11:31:38.710682 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cqlsx" event={"ID":"86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f","Type":"ContainerDied","Data":"d3abbd19ee12b5bda3502280acd949d13d0a02256a13414347d7f4740c40d154"} Mar 17 11:31:38 crc kubenswrapper[4742]: I0317 11:31:38.710707 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cqlsx" event={"ID":"86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f","Type":"ContainerStarted","Data":"ba5ce8ad369caaf644b6ea51cd74b2b1c9a1a35a3781692e7183995cada473e7"} Mar 17 11:31:38 crc kubenswrapper[4742]: I0317 11:31:38.715502 4742 generic.go:334] "Generic (PLEG): container finished" podID="308384f8-4874-467d-92e9-d5078d3017b5" containerID="af718105c77fc34a33fccece18d4b68331853b7c0e36f08268c37e336bcf5dcd" exitCode=0 Mar 17 11:31:38 crc kubenswrapper[4742]: I0317 11:31:38.715548 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8f69-account-create-update-pwhwq" event={"ID":"308384f8-4874-467d-92e9-d5078d3017b5","Type":"ContainerDied","Data":"af718105c77fc34a33fccece18d4b68331853b7c0e36f08268c37e336bcf5dcd"} Mar 17 11:31:38 crc kubenswrapper[4742]: I0317 11:31:38.715574 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8f69-account-create-update-pwhwq" event={"ID":"308384f8-4874-467d-92e9-d5078d3017b5","Type":"ContainerStarted","Data":"c2ee54ec27a500e02412eb2f954f10f32dea20943ca21748de49c7ffdb737f57"} Mar 17 11:31:39 crc kubenswrapper[4742]: I0317 11:31:39.011275 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-7w86s"] Mar 17 11:31:39 crc kubenswrapper[4742]: I0317 11:31:39.012739 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7w86s" Mar 17 11:31:39 crc kubenswrapper[4742]: I0317 11:31:39.018245 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 17 11:31:39 crc kubenswrapper[4742]: I0317 11:31:39.022309 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7w86s"] Mar 17 11:31:39 crc kubenswrapper[4742]: I0317 11:31:39.083001 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:39 crc kubenswrapper[4742]: I0317 11:31:39.084135 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f298cea0-0d19-4e92-9048-24fda7329b88-operator-scripts\") pod \"root-account-create-update-7w86s\" (UID: \"f298cea0-0d19-4e92-9048-24fda7329b88\") " pod="openstack/root-account-create-update-7w86s" Mar 17 11:31:39 crc kubenswrapper[4742]: I0317 11:31:39.084265 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzmxt\" (UniqueName: \"kubernetes.io/projected/f298cea0-0d19-4e92-9048-24fda7329b88-kube-api-access-vzmxt\") pod \"root-account-create-update-7w86s\" (UID: \"f298cea0-0d19-4e92-9048-24fda7329b88\") " pod="openstack/root-account-create-update-7w86s" Mar 17 11:31:39 crc kubenswrapper[4742]: E0317 11:31:39.083261 4742 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 17 11:31:39 crc kubenswrapper[4742]: E0317 11:31:39.084560 4742 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 17 11:31:39 crc kubenswrapper[4742]: E0317 11:31:39.084674 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift podName:be22c821-2e25-47ed-938d-c748fc55a4c6 nodeName:}" failed. No retries permitted until 2026-03-17 11:31:43.084658462 +0000 UTC m=+1206.210786220 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift") pod "swift-storage-0" (UID: "be22c821-2e25-47ed-938d-c748fc55a4c6") : configmap "swift-ring-files" not found Mar 17 11:31:39 crc kubenswrapper[4742]: I0317 11:31:39.185991 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f298cea0-0d19-4e92-9048-24fda7329b88-operator-scripts\") pod \"root-account-create-update-7w86s\" (UID: \"f298cea0-0d19-4e92-9048-24fda7329b88\") " pod="openstack/root-account-create-update-7w86s" Mar 17 11:31:39 crc kubenswrapper[4742]: I0317 11:31:39.189210 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzmxt\" (UniqueName: \"kubernetes.io/projected/f298cea0-0d19-4e92-9048-24fda7329b88-kube-api-access-vzmxt\") pod \"root-account-create-update-7w86s\" (UID: \"f298cea0-0d19-4e92-9048-24fda7329b88\") " pod="openstack/root-account-create-update-7w86s" Mar 17 11:31:39 crc kubenswrapper[4742]: I0317 11:31:39.190698 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f298cea0-0d19-4e92-9048-24fda7329b88-operator-scripts\") pod \"root-account-create-update-7w86s\" (UID: \"f298cea0-0d19-4e92-9048-24fda7329b88\") " pod="openstack/root-account-create-update-7w86s" Mar 17 11:31:39 crc kubenswrapper[4742]: I0317 11:31:39.212626 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzmxt\" (UniqueName: \"kubernetes.io/projected/f298cea0-0d19-4e92-9048-24fda7329b88-kube-api-access-vzmxt\") pod \"root-account-create-update-7w86s\" (UID: \"f298cea0-0d19-4e92-9048-24fda7329b88\") " pod="openstack/root-account-create-update-7w86s" Mar 17 11:31:39 crc kubenswrapper[4742]: I0317 11:31:39.338290 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7w86s" Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.578017 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-cqlsx"
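The repeated MountVolume.SetUp failures for "etc-swift" on swift-storage-0 show the kubelet's per-operation exponential backoff in nestedpendingoperations.go: the durationBeforeRetry doubles on each failure — 500ms, then 1s, then 2s, then 4s in the entries above — until the missing "swift-ring-files" ConfigMap exists (presumably published by the swift-ring-rebalance job seen starting in parallel) and the projected volume can mount. A minimal sketch of that doubling schedule, with an assumed cap value (the real kubelet caps the delay, but the exact cap is not shown in this log):

```go
package main

import (
	"fmt"
	"time"
)

// backoff reproduces the retry delays visible in the log:
// 500ms -> 1s -> 2s -> 4s, doubling per failure up to a cap.
type backoff struct {
	delay time.Duration
	cap   time.Duration // assumed cap; illustrative only
}

// next returns the wait before the following retry and doubles the stored delay.
func (b *backoff) next() time.Duration {
	d := b.delay
	b.delay *= 2
	if b.delay > b.cap {
		b.delay = b.cap
	}
	return d
}

func main() {
	b := &backoff{delay: 500 * time.Millisecond, cap: 2 * time.Minute}
	for i := 0; i < 4; i++ {
		fmt.Printf("durationBeforeRetry %s\n", b.next()) // 500ms, 1s, 2s, 4s
	}
}
```

Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.580888 4742 util.go:48] "No ready sandbox for pod can be found.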
Need to start a new one" pod="openstack/glance-8f69-account-create-update-pwhwq" Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.724633 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9lmf\" (UniqueName: \"kubernetes.io/projected/86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f-kube-api-access-r9lmf\") pod \"86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f\" (UID: \"86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f\") " Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.724902 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f-operator-scripts\") pod \"86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f\" (UID: \"86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f\") " Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.725039 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4frc\" (UniqueName: \"kubernetes.io/projected/308384f8-4874-467d-92e9-d5078d3017b5-kube-api-access-s4frc\") pod \"308384f8-4874-467d-92e9-d5078d3017b5\" (UID: \"308384f8-4874-467d-92e9-d5078d3017b5\") " Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.725116 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/308384f8-4874-467d-92e9-d5078d3017b5-operator-scripts\") pod \"308384f8-4874-467d-92e9-d5078d3017b5\" (UID: \"308384f8-4874-467d-92e9-d5078d3017b5\") " Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.725548 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f" (UID: "86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.725942 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/308384f8-4874-467d-92e9-d5078d3017b5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "308384f8-4874-467d-92e9-d5078d3017b5" (UID: "308384f8-4874-467d-92e9-d5078d3017b5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.731067 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f-kube-api-access-r9lmf" (OuterVolumeSpecName: "kube-api-access-r9lmf") pod "86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f" (UID: "86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f"). InnerVolumeSpecName "kube-api-access-r9lmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.731205 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308384f8-4874-467d-92e9-d5078d3017b5-kube-api-access-s4frc" (OuterVolumeSpecName: "kube-api-access-s4frc") pod "308384f8-4874-467d-92e9-d5078d3017b5" (UID: "308384f8-4874-467d-92e9-d5078d3017b5"). InnerVolumeSpecName "kube-api-access-s4frc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.732592 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cqlsx" event={"ID":"86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f","Type":"ContainerDied","Data":"ba5ce8ad369caaf644b6ea51cd74b2b1c9a1a35a3781692e7183995cada473e7"} Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.732617 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-cqlsx" Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.732649 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba5ce8ad369caaf644b6ea51cd74b2b1c9a1a35a3781692e7183995cada473e7" Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.735799 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8f69-account-create-update-pwhwq" event={"ID":"308384f8-4874-467d-92e9-d5078d3017b5","Type":"ContainerDied","Data":"c2ee54ec27a500e02412eb2f954f10f32dea20943ca21748de49c7ffdb737f57"} Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.735903 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2ee54ec27a500e02412eb2f954f10f32dea20943ca21748de49c7ffdb737f57" Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.735857 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8f69-account-create-update-pwhwq" Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.827060 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9lmf\" (UniqueName: \"kubernetes.io/projected/86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f-kube-api-access-r9lmf\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.827087 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.827097 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4frc\" (UniqueName: \"kubernetes.io/projected/308384f8-4874-467d-92e9-d5078d3017b5-kube-api-access-s4frc\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.827105 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/308384f8-4874-467d-92e9-d5078d3017b5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:40 crc kubenswrapper[4742]: W0317 11:31:40.945873 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf298cea0_0d19_4e92_9048_24fda7329b88.slice/crio-fc5e1a7dffdc897b489cfea39ea455ce23fe331e2c37712728b8ea732cf555c7 WatchSource:0}: Error finding container fc5e1a7dffdc897b489cfea39ea455ce23fe331e2c37712728b8ea732cf555c7: Status 404 returned error can't find the container with id fc5e1a7dffdc897b489cfea39ea455ce23fe331e2c37712728b8ea732cf555c7 Mar 17 11:31:40 crc kubenswrapper[4742]: I0317 11:31:40.949549 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7w86s"] Mar 17 11:31:41 crc kubenswrapper[4742]: I0317 11:31:41.744210 4742 generic.go:334] "Generic (PLEG): container finished" podID="f298cea0-0d19-4e92-9048-24fda7329b88" 
containerID="b2e577c301d5b5bcddc3c97fd98c17d9bed76ad4ca32102a18078d5fafa190e6" exitCode=0 Mar 17 11:31:41 crc kubenswrapper[4742]: I0317 11:31:41.744259 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7w86s" event={"ID":"f298cea0-0d19-4e92-9048-24fda7329b88","Type":"ContainerDied","Data":"b2e577c301d5b5bcddc3c97fd98c17d9bed76ad4ca32102a18078d5fafa190e6"} Mar 17 11:31:41 crc kubenswrapper[4742]: I0317 11:31:41.744470 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7w86s" event={"ID":"f298cea0-0d19-4e92-9048-24fda7329b88","Type":"ContainerStarted","Data":"fc5e1a7dffdc897b489cfea39ea455ce23fe331e2c37712728b8ea732cf555c7"} Mar 17 11:31:41 crc kubenswrapper[4742]: I0317 11:31:41.746337 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rrnw9" event={"ID":"3cc5195f-ecc0-4f8e-bc53-ea602fff501d","Type":"ContainerStarted","Data":"86483a6b73f108b854b26b8c900123bc350da5883da9d0518d6294473724d8ce"} Mar 17 11:31:41 crc kubenswrapper[4742]: I0317 11:31:41.787730 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-rrnw9" podStartSLOduration=2.929112172 podStartE2EDuration="6.787710774s" podCreationTimestamp="2026-03-17 11:31:35 +0000 UTC" firstStartedPulling="2026-03-17 11:31:36.626341855 +0000 UTC m=+1199.752469623" lastFinishedPulling="2026-03-17 11:31:40.484940467 +0000 UTC m=+1203.611068225" observedRunningTime="2026-03-17 11:31:41.784714045 +0000 UTC m=+1204.910841813" watchObservedRunningTime="2026-03-17 11:31:41.787710774 +0000 UTC m=+1204.913838552" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.324537 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-6sr2t"] Mar 17 11:31:42 crc kubenswrapper[4742]: E0317 11:31:42.324869 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f" containerName="mariadb-database-create" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.324884 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f" containerName="mariadb-database-create" Mar 17 11:31:42 crc kubenswrapper[4742]: E0317 11:31:42.324938 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="308384f8-4874-467d-92e9-d5078d3017b5" containerName="mariadb-account-create-update" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.324945 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="308384f8-4874-467d-92e9-d5078d3017b5" containerName="mariadb-account-create-update" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.325086 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f" containerName="mariadb-database-create" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.325105 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="308384f8-4874-467d-92e9-d5078d3017b5" containerName="mariadb-account-create-update" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.325564 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6sr2t" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.327667 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.328213 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rh74z" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.377124 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6sr2t"] Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.458175 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-config-data\") pod \"glance-db-sync-6sr2t\" (UID: \"828f87ee-72a4-43e5-88f7-0a15975b90a5\") " pod="openstack/glance-db-sync-6sr2t" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.458336 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-combined-ca-bundle\") pod \"glance-db-sync-6sr2t\" (UID: \"828f87ee-72a4-43e5-88f7-0a15975b90a5\") " pod="openstack/glance-db-sync-6sr2t" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.458399 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltvl6\" (UniqueName: \"kubernetes.io/projected/828f87ee-72a4-43e5-88f7-0a15975b90a5-kube-api-access-ltvl6\") pod \"glance-db-sync-6sr2t\" (UID: \"828f87ee-72a4-43e5-88f7-0a15975b90a5\") " pod="openstack/glance-db-sync-6sr2t" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.458455 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-db-sync-config-data\") pod \"glance-db-sync-6sr2t\" (UID: \"828f87ee-72a4-43e5-88f7-0a15975b90a5\") " pod="openstack/glance-db-sync-6sr2t" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.561065 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-db-sync-config-data\") pod \"glance-db-sync-6sr2t\" (UID: \"828f87ee-72a4-43e5-88f7-0a15975b90a5\") " pod="openstack/glance-db-sync-6sr2t" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.561408 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-config-data\") pod \"glance-db-sync-6sr2t\" (UID: \"828f87ee-72a4-43e5-88f7-0a15975b90a5\") " pod="openstack/glance-db-sync-6sr2t" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.561601 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-combined-ca-bundle\") pod \"glance-db-sync-6sr2t\" (UID: \"828f87ee-72a4-43e5-88f7-0a15975b90a5\") " pod="openstack/glance-db-sync-6sr2t" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.561690 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltvl6\" (UniqueName: \"kubernetes.io/projected/828f87ee-72a4-43e5-88f7-0a15975b90a5-kube-api-access-ltvl6\") pod 
\"glance-db-sync-6sr2t\" (UID: \"828f87ee-72a4-43e5-88f7-0a15975b90a5\") " pod="openstack/glance-db-sync-6sr2t" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.570010 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-config-data\") pod \"glance-db-sync-6sr2t\" (UID: \"828f87ee-72a4-43e5-88f7-0a15975b90a5\") " pod="openstack/glance-db-sync-6sr2t" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.571349 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-db-sync-config-data\") pod \"glance-db-sync-6sr2t\" (UID: \"828f87ee-72a4-43e5-88f7-0a15975b90a5\") " pod="openstack/glance-db-sync-6sr2t" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.571985 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-combined-ca-bundle\") pod \"glance-db-sync-6sr2t\" (UID: \"828f87ee-72a4-43e5-88f7-0a15975b90a5\") " pod="openstack/glance-db-sync-6sr2t" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.581849 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltvl6\" (UniqueName: \"kubernetes.io/projected/828f87ee-72a4-43e5-88f7-0a15975b90a5-kube-api-access-ltvl6\") pod \"glance-db-sync-6sr2t\" (UID: \"828f87ee-72a4-43e5-88f7-0a15975b90a5\") " pod="openstack/glance-db-sync-6sr2t" Mar 17 11:31:42 crc kubenswrapper[4742]: I0317 11:31:42.660882 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-6sr2t" Mar 17 11:31:43 crc kubenswrapper[4742]: I0317 11:31:43.148151 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7w86s" Mar 17 11:31:43 crc kubenswrapper[4742]: I0317 11:31:43.171815 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:43 crc kubenswrapper[4742]: E0317 11:31:43.171995 4742 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 17 11:31:43 crc kubenswrapper[4742]: E0317 11:31:43.172010 4742 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 17 11:31:43 crc kubenswrapper[4742]: E0317 11:31:43.172058 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift podName:be22c821-2e25-47ed-938d-c748fc55a4c6 nodeName:}" failed. No retries permitted until 2026-03-17 11:31:51.172044537 +0000 UTC m=+1214.298172295 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift") pod "swift-storage-0" (UID: "be22c821-2e25-47ed-938d-c748fc55a4c6") : configmap "swift-ring-files" not found Mar 17 11:31:43 crc kubenswrapper[4742]: I0317 11:31:43.207579 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6sr2t"] Mar 17 11:31:43 crc kubenswrapper[4742]: I0317 11:31:43.272807 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzmxt\" (UniqueName: \"kubernetes.io/projected/f298cea0-0d19-4e92-9048-24fda7329b88-kube-api-access-vzmxt\") pod \"f298cea0-0d19-4e92-9048-24fda7329b88\" (UID: \"f298cea0-0d19-4e92-9048-24fda7329b88\") " Mar 17 11:31:43 crc kubenswrapper[4742]: I0317 11:31:43.272938 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f298cea0-0d19-4e92-9048-24fda7329b88-operator-scripts\") pod \"f298cea0-0d19-4e92-9048-24fda7329b88\" (UID: \"f298cea0-0d19-4e92-9048-24fda7329b88\") " Mar 17 11:31:43 crc kubenswrapper[4742]: I0317 11:31:43.274079 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f298cea0-0d19-4e92-9048-24fda7329b88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f298cea0-0d19-4e92-9048-24fda7329b88" (UID: "f298cea0-0d19-4e92-9048-24fda7329b88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:43 crc kubenswrapper[4742]: I0317 11:31:43.277359 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f298cea0-0d19-4e92-9048-24fda7329b88-kube-api-access-vzmxt" (OuterVolumeSpecName: "kube-api-access-vzmxt") pod "f298cea0-0d19-4e92-9048-24fda7329b88" (UID: "f298cea0-0d19-4e92-9048-24fda7329b88"). InnerVolumeSpecName "kube-api-access-vzmxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:31:43 crc kubenswrapper[4742]: I0317 11:31:43.375874 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzmxt\" (UniqueName: \"kubernetes.io/projected/f298cea0-0d19-4e92-9048-24fda7329b88-kube-api-access-vzmxt\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:43 crc kubenswrapper[4742]: I0317 11:31:43.375961 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f298cea0-0d19-4e92-9048-24fda7329b88-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:43 crc kubenswrapper[4742]: I0317 11:31:43.782103 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6sr2t" event={"ID":"828f87ee-72a4-43e5-88f7-0a15975b90a5","Type":"ContainerStarted","Data":"42e7e7f6ead65811bd03dfa93f7c9130753d82807fe03b45caa667f47a6dc3c1"} Mar 17 11:31:43 crc kubenswrapper[4742]: I0317 11:31:43.784112 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7w86s" event={"ID":"f298cea0-0d19-4e92-9048-24fda7329b88","Type":"ContainerDied","Data":"fc5e1a7dffdc897b489cfea39ea455ce23fe331e2c37712728b8ea732cf555c7"} Mar 17 11:31:43 crc kubenswrapper[4742]: I0317 11:31:43.784146 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7w86s" Mar 17 11:31:43 crc kubenswrapper[4742]: I0317 11:31:43.784150 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc5e1a7dffdc897b489cfea39ea455ce23fe331e2c37712728b8ea732cf555c7" Mar 17 11:31:44 crc kubenswrapper[4742]: I0317 11:31:44.642698 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-ldqhg" Mar 17 11:31:44 crc kubenswrapper[4742]: I0317 11:31:44.702648 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7d25z"] Mar 17 11:31:44 crc kubenswrapper[4742]: I0317 11:31:44.702936 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" podUID="3a91260a-abb6-4e26-b041-c39b36369405" containerName="dnsmasq-dns" containerID="cri-o://b55f0ae81d3c4dbb89d3d082ff9141409db2a32120c416f78cb9b535b2a1e22c" gracePeriod=10 Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.258417 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.320287 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-config\") pod \"3a91260a-abb6-4e26-b041-c39b36369405\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.320362 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-ovsdbserver-sb\") pod \"3a91260a-abb6-4e26-b041-c39b36369405\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.320420 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-ovsdbserver-nb\") pod \"3a91260a-abb6-4e26-b041-c39b36369405\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.320474 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-dns-svc\") pod \"3a91260a-abb6-4e26-b041-c39b36369405\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.320500 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgd2\" (UniqueName: \"kubernetes.io/projected/3a91260a-abb6-4e26-b041-c39b36369405-kube-api-access-xcgd2\") pod \"3a91260a-abb6-4e26-b041-c39b36369405\" (UID: \"3a91260a-abb6-4e26-b041-c39b36369405\") " Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.334139 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-7w86s"] Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.335400 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a91260a-abb6-4e26-b041-c39b36369405-kube-api-access-xcgd2" (OuterVolumeSpecName: "kube-api-access-xcgd2") pod "3a91260a-abb6-4e26-b041-c39b36369405" (UID: "3a91260a-abb6-4e26-b041-c39b36369405"). InnerVolumeSpecName "kube-api-access-xcgd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.347923 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-7w86s"] Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.383815 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-config" (OuterVolumeSpecName: "config") pod "3a91260a-abb6-4e26-b041-c39b36369405" (UID: "3a91260a-abb6-4e26-b041-c39b36369405"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.395474 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3a91260a-abb6-4e26-b041-c39b36369405" (UID: "3a91260a-abb6-4e26-b041-c39b36369405"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.419488 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3a91260a-abb6-4e26-b041-c39b36369405" (UID: "3a91260a-abb6-4e26-b041-c39b36369405"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.421733 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.421753 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.421763 4742 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.421771 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgd2\" (UniqueName: \"kubernetes.io/projected/3a91260a-abb6-4e26-b041-c39b36369405-kube-api-access-xcgd2\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.431260 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3a91260a-abb6-4e26-b041-c39b36369405" (UID: "3a91260a-abb6-4e26-b041-c39b36369405"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.523091 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a91260a-abb6-4e26-b041-c39b36369405-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.801429 4742 generic.go:334] "Generic (PLEG): container finished" podID="3a91260a-abb6-4e26-b041-c39b36369405" containerID="b55f0ae81d3c4dbb89d3d082ff9141409db2a32120c416f78cb9b535b2a1e22c" exitCode=0 Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.801526 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.801552 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" event={"ID":"3a91260a-abb6-4e26-b041-c39b36369405","Type":"ContainerDied","Data":"b55f0ae81d3c4dbb89d3d082ff9141409db2a32120c416f78cb9b535b2a1e22c"} Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.801928 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7d25z" event={"ID":"3a91260a-abb6-4e26-b041-c39b36369405","Type":"ContainerDied","Data":"25dd5612e6aae3591d98a70b3b5612c98bedc9989e5b0e013aa3e331619ea2ee"} Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.801956 4742 scope.go:117] "RemoveContainer" containerID="b55f0ae81d3c4dbb89d3d082ff9141409db2a32120c416f78cb9b535b2a1e22c" Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.843900 4742 scope.go:117] "RemoveContainer" containerID="f9239c89d4287fee5ffddf3435fae18b180b27f99abeacbf6b8fbe5c62625272" Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.884075 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7d25z"] Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.893871 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7d25z"] Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.910827 4742 scope.go:117] "RemoveContainer" containerID="b55f0ae81d3c4dbb89d3d082ff9141409db2a32120c416f78cb9b535b2a1e22c" Mar 17 11:31:45 crc kubenswrapper[4742]: E0317 11:31:45.911404 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55f0ae81d3c4dbb89d3d082ff9141409db2a32120c416f78cb9b535b2a1e22c\": container with ID starting with b55f0ae81d3c4dbb89d3d082ff9141409db2a32120c416f78cb9b535b2a1e22c not found: ID does not exist" containerID="b55f0ae81d3c4dbb89d3d082ff9141409db2a32120c416f78cb9b535b2a1e22c" Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.911447 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55f0ae81d3c4dbb89d3d082ff9141409db2a32120c416f78cb9b535b2a1e22c"} err="failed to get container status \"b55f0ae81d3c4dbb89d3d082ff9141409db2a32120c416f78cb9b535b2a1e22c\": rpc error: code = NotFound desc = could not find container \"b55f0ae81d3c4dbb89d3d082ff9141409db2a32120c416f78cb9b535b2a1e22c\": container with ID starting with b55f0ae81d3c4dbb89d3d082ff9141409db2a32120c416f78cb9b535b2a1e22c not found: ID does not exist" Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.911473 4742 scope.go:117] "RemoveContainer" containerID="f9239c89d4287fee5ffddf3435fae18b180b27f99abeacbf6b8fbe5c62625272" Mar 17 11:31:45 crc kubenswrapper[4742]: E0317 
11:31:45.911819 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9239c89d4287fee5ffddf3435fae18b180b27f99abeacbf6b8fbe5c62625272\": container with ID starting with f9239c89d4287fee5ffddf3435fae18b180b27f99abeacbf6b8fbe5c62625272 not found: ID does not exist" containerID="f9239c89d4287fee5ffddf3435fae18b180b27f99abeacbf6b8fbe5c62625272" Mar 17 11:31:45 crc kubenswrapper[4742]: I0317 11:31:45.911851 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9239c89d4287fee5ffddf3435fae18b180b27f99abeacbf6b8fbe5c62625272"} err="failed to get container status \"f9239c89d4287fee5ffddf3435fae18b180b27f99abeacbf6b8fbe5c62625272\": rpc error: code = NotFound desc = could not find container \"f9239c89d4287fee5ffddf3435fae18b180b27f99abeacbf6b8fbe5c62625272\": container with ID starting with f9239c89d4287fee5ffddf3435fae18b180b27f99abeacbf6b8fbe5c62625272 not found: ID does not exist" Mar 17 11:31:46 crc kubenswrapper[4742]: I0317 11:31:46.189563 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 17 11:31:46 crc kubenswrapper[4742]: I0317 11:31:46.671370 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a91260a-abb6-4e26-b041-c39b36369405" path="/var/lib/kubelet/pods/3a91260a-abb6-4e26-b041-c39b36369405/volumes" Mar 17 11:31:46 crc kubenswrapper[4742]: I0317 11:31:46.672405 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f298cea0-0d19-4e92-9048-24fda7329b88" path="/var/lib/kubelet/pods/f298cea0-0d19-4e92-9048-24fda7329b88/volumes" Mar 17 11:31:47 crc kubenswrapper[4742]: I0317 11:31:47.820376 4742 generic.go:334] "Generic (PLEG): container finished" podID="3cc5195f-ecc0-4f8e-bc53-ea602fff501d" containerID="86483a6b73f108b854b26b8c900123bc350da5883da9d0518d6294473724d8ce" exitCode=0 Mar 17 11:31:47 crc kubenswrapper[4742]: I0317 11:31:47.821052 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rrnw9" event={"ID":"3cc5195f-ecc0-4f8e-bc53-ea602fff501d","Type":"ContainerDied","Data":"86483a6b73f108b854b26b8c900123bc350da5883da9d0518d6294473724d8ce"} Mar 17 11:31:48 crc kubenswrapper[4742]: I0317 11:31:48.043975 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:31:48 crc kubenswrapper[4742]: I0317 11:31:48.044057 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.042157 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jt6xb"] Mar 17 11:31:49 crc kubenswrapper[4742]: E0317 11:31:49.042545 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f298cea0-0d19-4e92-9048-24fda7329b88" containerName="mariadb-account-create-update" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.042560 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f298cea0-0d19-4e92-9048-24fda7329b88" 
containerName="mariadb-account-create-update" Mar 17 11:31:49 crc kubenswrapper[4742]: E0317 11:31:49.042573 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a91260a-abb6-4e26-b041-c39b36369405" containerName="dnsmasq-dns" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.042579 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a91260a-abb6-4e26-b041-c39b36369405" containerName="dnsmasq-dns" Mar 17 11:31:49 crc kubenswrapper[4742]: E0317 11:31:49.042591 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a91260a-abb6-4e26-b041-c39b36369405" containerName="init" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.042598 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a91260a-abb6-4e26-b041-c39b36369405" containerName="init" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.042750 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a91260a-abb6-4e26-b041-c39b36369405" containerName="dnsmasq-dns" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.042775 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f298cea0-0d19-4e92-9048-24fda7329b88" containerName="mariadb-account-create-update" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.043332 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jt6xb" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.046123 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.050445 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jt6xb"] Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.136350 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.215727 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-combined-ca-bundle\") pod \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.216020 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-scripts\") pod \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.216173 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-swiftconf\") pod \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.216279 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-ring-data-devices\") pod \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.216405 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-etc-swift\") pod \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.216516 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx4ql\" (UniqueName: \"kubernetes.io/projected/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-kube-api-access-fx4ql\") pod \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.216641 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-dispersionconf\") pod \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\" (UID: \"3cc5195f-ecc0-4f8e-bc53-ea602fff501d\") " Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.216948 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk98j\" (UniqueName: \"kubernetes.io/projected/7e711c48-d9a0-4fd4-8d8d-734cb315d3e8-kube-api-access-pk98j\") pod \"root-account-create-update-jt6xb\" (UID: \"7e711c48-d9a0-4fd4-8d8d-734cb315d3e8\") " pod="openstack/root-account-create-update-jt6xb" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.217019 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3cc5195f-ecc0-4f8e-bc53-ea602fff501d" (UID: "3cc5195f-ecc0-4f8e-bc53-ea602fff501d"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.217249 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e711c48-d9a0-4fd4-8d8d-734cb315d3e8-operator-scripts\") pod \"root-account-create-update-jt6xb\" (UID: \"7e711c48-d9a0-4fd4-8d8d-734cb315d3e8\") " pod="openstack/root-account-create-update-jt6xb" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.217317 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3cc5195f-ecc0-4f8e-bc53-ea602fff501d" (UID: "3cc5195f-ecc0-4f8e-bc53-ea602fff501d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.217659 4742 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.217736 4742 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.226808 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-kube-api-access-fx4ql" (OuterVolumeSpecName: "kube-api-access-fx4ql") pod "3cc5195f-ecc0-4f8e-bc53-ea602fff501d" (UID: "3cc5195f-ecc0-4f8e-bc53-ea602fff501d"). InnerVolumeSpecName "kube-api-access-fx4ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.230635 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3cc5195f-ecc0-4f8e-bc53-ea602fff501d" (UID: "3cc5195f-ecc0-4f8e-bc53-ea602fff501d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.238028 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3cc5195f-ecc0-4f8e-bc53-ea602fff501d" (UID: "3cc5195f-ecc0-4f8e-bc53-ea602fff501d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.238081 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-scripts" (OuterVolumeSpecName: "scripts") pod "3cc5195f-ecc0-4f8e-bc53-ea602fff501d" (UID: "3cc5195f-ecc0-4f8e-bc53-ea602fff501d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.239130 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3cc5195f-ecc0-4f8e-bc53-ea602fff501d" (UID: "3cc5195f-ecc0-4f8e-bc53-ea602fff501d"). 
InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.320217 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk98j\" (UniqueName: \"kubernetes.io/projected/7e711c48-d9a0-4fd4-8d8d-734cb315d3e8-kube-api-access-pk98j\") pod \"root-account-create-update-jt6xb\" (UID: \"7e711c48-d9a0-4fd4-8d8d-734cb315d3e8\") " pod="openstack/root-account-create-update-jt6xb" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.320375 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e711c48-d9a0-4fd4-8d8d-734cb315d3e8-operator-scripts\") pod \"root-account-create-update-jt6xb\" (UID: \"7e711c48-d9a0-4fd4-8d8d-734cb315d3e8\") " pod="openstack/root-account-create-update-jt6xb" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.320514 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx4ql\" (UniqueName: \"kubernetes.io/projected/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-kube-api-access-fx4ql\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.320536 4742 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.320555 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.320574 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.320589 4742 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3cc5195f-ecc0-4f8e-bc53-ea602fff501d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.321982 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e711c48-d9a0-4fd4-8d8d-734cb315d3e8-operator-scripts\") pod \"root-account-create-update-jt6xb\" (UID: \"7e711c48-d9a0-4fd4-8d8d-734cb315d3e8\") " pod="openstack/root-account-create-update-jt6xb" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.339548 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk98j\" (UniqueName: \"kubernetes.io/projected/7e711c48-d9a0-4fd4-8d8d-734cb315d3e8-kube-api-access-pk98j\") pod \"root-account-create-update-jt6xb\" (UID: \"7e711c48-d9a0-4fd4-8d8d-734cb315d3e8\") " pod="openstack/root-account-create-update-jt6xb" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.364900 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jt6xb" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.836917 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rrnw9" event={"ID":"3cc5195f-ecc0-4f8e-bc53-ea602fff501d","Type":"ContainerDied","Data":"8ec4a550266ce3278a1377fb8af8b462d79a152e45f3fb9ae70c819ea211977b"} Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.837242 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ec4a550266ce3278a1377fb8af8b462d79a152e45f3fb9ae70c819ea211977b" Mar 17 11:31:49 crc kubenswrapper[4742]: I0317 11:31:49.836990 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rrnw9" Mar 17 11:31:51 crc kubenswrapper[4742]: I0317 11:31:51.252871 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:51 crc kubenswrapper[4742]: I0317 11:31:51.261094 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/be22c821-2e25-47ed-938d-c748fc55a4c6-etc-swift\") pod \"swift-storage-0\" (UID: \"be22c821-2e25-47ed-938d-c748fc55a4c6\") " pod="openstack/swift-storage-0" Mar 17 11:31:51 crc kubenswrapper[4742]: I0317 11:31:51.264876 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.200720 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-4j5jz" podUID="158a0d7f-e22f-4f44-aca2-efb59ff90439" containerName="ovn-controller" probeResult="failure" output=< Mar 17 11:31:52 crc kubenswrapper[4742]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 17 11:31:52 crc kubenswrapper[4742]: > Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.313686 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.324134 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-dmqzv" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.541309 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-4j5jz-config-vzbgt"] Mar 17 11:31:52 crc kubenswrapper[4742]: E0317 11:31:52.541769 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cc5195f-ecc0-4f8e-bc53-ea602fff501d" containerName="swift-ring-rebalance" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.541792 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cc5195f-ecc0-4f8e-bc53-ea602fff501d" containerName="swift-ring-rebalance" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.542043 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cc5195f-ecc0-4f8e-bc53-ea602fff501d" containerName="swift-ring-rebalance" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.542929 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.548707 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4j5jz-config-vzbgt"] Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.550280 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.680791 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtkmx\" (UniqueName: \"kubernetes.io/projected/ab12177c-bfc5-40c8-9468-81096ef2ac5e-kube-api-access-wtkmx\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.680922 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-run-ovn\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.681036 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ab12177c-bfc5-40c8-9468-81096ef2ac5e-additional-scripts\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.681099 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-log-ovn\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.681185 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ab12177c-bfc5-40c8-9468-81096ef2ac5e-scripts\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.681229 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-run\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.782759 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ab12177c-bfc5-40c8-9468-81096ef2ac5e-additional-scripts\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.782840 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-log-ovn\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.782929 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ab12177c-bfc5-40c8-9468-81096ef2ac5e-scripts\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.782961 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-run\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.783047 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtkmx\" (UniqueName: \"kubernetes.io/projected/ab12177c-bfc5-40c8-9468-81096ef2ac5e-kube-api-access-wtkmx\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.783097 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-run-ovn\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.783149 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-log-ovn\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.783420 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-run\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.784624 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-run-ovn\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.785790 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ab12177c-bfc5-40c8-9468-81096ef2ac5e-additional-scripts\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.801059 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ab12177c-bfc5-40c8-9468-81096ef2ac5e-scripts\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.803357 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtkmx\" (UniqueName: \"kubernetes.io/projected/ab12177c-bfc5-40c8-9468-81096ef2ac5e-kube-api-access-wtkmx\") pod \"ovn-controller-4j5jz-config-vzbgt\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:52 crc kubenswrapper[4742]: I0317 11:31:52.871960 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:56 crc kubenswrapper[4742]: I0317 11:31:56.302431 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jt6xb"] Mar 17 11:31:56 crc kubenswrapper[4742]: I0317 11:31:56.313260 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4j5jz-config-vzbgt"] Mar 17 11:31:56 crc kubenswrapper[4742]: W0317 11:31:56.330469 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e711c48_d9a0_4fd4_8d8d_734cb315d3e8.slice/crio-aa34f48d0c855e0a9d4088217e5b66be2be38c66be8695d2510b2f6fb4694c0f WatchSource:0}: Error finding container aa34f48d0c855e0a9d4088217e5b66be2be38c66be8695d2510b2f6fb4694c0f: Status 404 returned error can't find the container with id aa34f48d0c855e0a9d4088217e5b66be2be38c66be8695d2510b2f6fb4694c0f Mar 17 11:31:56 crc kubenswrapper[4742]: I0317 11:31:56.340207 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 17 11:31:56 crc kubenswrapper[4742]: W0317 11:31:56.346560 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe22c821_2e25_47ed_938d_c748fc55a4c6.slice/crio-a3d0ac34e5e6a3604d4d9be6583e1d1faaa7c76af425d9ead96d8594059724b6 WatchSource:0}: Error finding container a3d0ac34e5e6a3604d4d9be6583e1d1faaa7c76af425d9ead96d8594059724b6: Status 404 returned error can't find the container with id a3d0ac34e5e6a3604d4d9be6583e1d1faaa7c76af425d9ead96d8594059724b6 Mar 17 11:31:56 crc kubenswrapper[4742]: I0317 11:31:56.909764 4742 generic.go:334] "Generic (PLEG): container finished" podID="7e711c48-d9a0-4fd4-8d8d-734cb315d3e8" containerID="e2282d37b0f4321b59cd38125bab6b08fe6bc64fa6ae9994352065ed6574c832" exitCode=0 Mar 17 11:31:56 crc kubenswrapper[4742]: I0317 11:31:56.909829 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jt6xb" event={"ID":"7e711c48-d9a0-4fd4-8d8d-734cb315d3e8","Type":"ContainerDied","Data":"e2282d37b0f4321b59cd38125bab6b08fe6bc64fa6ae9994352065ed6574c832"} Mar 17 11:31:56 crc kubenswrapper[4742]: I0317 11:31:56.910063 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jt6xb" event={"ID":"7e711c48-d9a0-4fd4-8d8d-734cb315d3e8","Type":"ContainerStarted","Data":"aa34f48d0c855e0a9d4088217e5b66be2be38c66be8695d2510b2f6fb4694c0f"} Mar 17 11:31:56 crc kubenswrapper[4742]: I0317 11:31:56.911139 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"be22c821-2e25-47ed-938d-c748fc55a4c6","Type":"ContainerStarted","Data":"a3d0ac34e5e6a3604d4d9be6583e1d1faaa7c76af425d9ead96d8594059724b6"} Mar 17 11:31:56 crc kubenswrapper[4742]: I0317 11:31:56.912204 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6sr2t" event={"ID":"828f87ee-72a4-43e5-88f7-0a15975b90a5","Type":"ContainerStarted","Data":"2ba37516202552ef925c69fed60b69b58405874b3021fb7d38fb547f33098c55"} Mar 17 11:31:56 crc kubenswrapper[4742]: I0317 11:31:56.913208 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4j5jz-config-vzbgt" event={"ID":"ab12177c-bfc5-40c8-9468-81096ef2ac5e","Type":"ContainerStarted","Data":"edd0f9d20440eac5de4e8baf73493f31ba4d5d6aa7d7cc31b9cf148c15d9e47e"} Mar 17 11:31:56 crc kubenswrapper[4742]: I0317 11:31:56.913242 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4j5jz-config-vzbgt" event={"ID":"ab12177c-bfc5-40c8-9468-81096ef2ac5e","Type":"ContainerStarted","Data":"972f82ef12540da5ad6341d69a4e02541676fce086fafa9691bbdcc30157084c"} Mar 17 11:31:56 crc kubenswrapper[4742]: I0317 11:31:56.949015 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-6sr2t" podStartSLOduration=2.346327304 podStartE2EDuration="14.948997754s" podCreationTimestamp="2026-03-17 11:31:42 +0000 UTC" firstStartedPulling="2026-03-17 11:31:43.219360413 +0000 UTC m=+1206.345488171" lastFinishedPulling="2026-03-17 11:31:55.822030863 +0000 UTC m=+1218.948158621" observedRunningTime="2026-03-17 11:31:56.940067767 +0000 UTC m=+1220.066195535" watchObservedRunningTime="2026-03-17 11:31:56.948997754 +0000 UTC m=+1220.075125512" Mar 17 11:31:56 crc kubenswrapper[4742]: I0317 11:31:56.958943 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-4j5jz-config-vzbgt" podStartSLOduration=4.958927748 podStartE2EDuration="4.958927748s" podCreationTimestamp="2026-03-17 11:31:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:31:56.955961609 +0000 UTC m=+1220.082089377" watchObservedRunningTime="2026-03-17 11:31:56.958927748 +0000 UTC m=+1220.085055506" Mar 17 11:31:57 crc kubenswrapper[4742]: I0317 11:31:57.198356 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-4j5jz" Mar 17 11:31:57 crc kubenswrapper[4742]: I0317 11:31:57.923529 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be22c821-2e25-47ed-938d-c748fc55a4c6","Type":"ContainerStarted","Data":"2efc36affed9552e2ff34c49266626a7c4e5f1a8af14c594cb12a5387d9ac85e"} Mar 17 11:31:57 crc kubenswrapper[4742]: I0317 11:31:57.924182 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be22c821-2e25-47ed-938d-c748fc55a4c6","Type":"ContainerStarted","Data":"f54fdb2a059c3746ba6dab8dff773fad169f2bbf14beb3b0a9beeb3948886387"} Mar 17 11:31:57 crc kubenswrapper[4742]: I0317 11:31:57.926734 4742 generic.go:334] "Generic (PLEG): container finished" podID="dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" containerID="2b56274b6b78ca4e5410d6fa294dba941d61ff2a15e2f2b60bc50b901df2e13d" exitCode=0 Mar 17 11:31:57 crc kubenswrapper[4742]: I0317 11:31:57.926828 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6","Type":"ContainerDied","Data":"2b56274b6b78ca4e5410d6fa294dba941d61ff2a15e2f2b60bc50b901df2e13d"} Mar 17 11:31:57 crc kubenswrapper[4742]: I0317 11:31:57.928509 4742 generic.go:334] "Generic (PLEG): container finished" podID="ab12177c-bfc5-40c8-9468-81096ef2ac5e" containerID="edd0f9d20440eac5de4e8baf73493f31ba4d5d6aa7d7cc31b9cf148c15d9e47e" exitCode=0 Mar 17 11:31:57 crc kubenswrapper[4742]: I0317 11:31:57.928538 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4j5jz-config-vzbgt" event={"ID":"ab12177c-bfc5-40c8-9468-81096ef2ac5e","Type":"ContainerDied","Data":"edd0f9d20440eac5de4e8baf73493f31ba4d5d6aa7d7cc31b9cf148c15d9e47e"} Mar 17 11:31:58 crc kubenswrapper[4742]: I0317 11:31:58.216392 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jt6xb" Mar 17 11:31:58 crc kubenswrapper[4742]: I0317 11:31:58.274049 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk98j\" (UniqueName: \"kubernetes.io/projected/7e711c48-d9a0-4fd4-8d8d-734cb315d3e8-kube-api-access-pk98j\") pod \"7e711c48-d9a0-4fd4-8d8d-734cb315d3e8\" (UID: \"7e711c48-d9a0-4fd4-8d8d-734cb315d3e8\") " Mar 17 11:31:58 crc kubenswrapper[4742]: I0317 11:31:58.274245 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e711c48-d9a0-4fd4-8d8d-734cb315d3e8-operator-scripts\") pod \"7e711c48-d9a0-4fd4-8d8d-734cb315d3e8\" (UID: \"7e711c48-d9a0-4fd4-8d8d-734cb315d3e8\") " Mar 17 11:31:58 crc kubenswrapper[4742]: I0317 11:31:58.274789 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e711c48-d9a0-4fd4-8d8d-734cb315d3e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e711c48-d9a0-4fd4-8d8d-734cb315d3e8" (UID: "7e711c48-d9a0-4fd4-8d8d-734cb315d3e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:58 crc kubenswrapper[4742]: I0317 11:31:58.288369 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e711c48-d9a0-4fd4-8d8d-734cb315d3e8-kube-api-access-pk98j" (OuterVolumeSpecName: "kube-api-access-pk98j") pod "7e711c48-d9a0-4fd4-8d8d-734cb315d3e8" (UID: "7e711c48-d9a0-4fd4-8d8d-734cb315d3e8"). InnerVolumeSpecName "kube-api-access-pk98j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:31:58 crc kubenswrapper[4742]: I0317 11:31:58.376000 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e711c48-d9a0-4fd4-8d8d-734cb315d3e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:58 crc kubenswrapper[4742]: I0317 11:31:58.376033 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk98j\" (UniqueName: \"kubernetes.io/projected/7e711c48-d9a0-4fd4-8d8d-734cb315d3e8-kube-api-access-pk98j\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:58 crc kubenswrapper[4742]: I0317 11:31:58.936400 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jt6xb" Mar 17 11:31:58 crc kubenswrapper[4742]: I0317 11:31:58.936405 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jt6xb" event={"ID":"7e711c48-d9a0-4fd4-8d8d-734cb315d3e8","Type":"ContainerDied","Data":"aa34f48d0c855e0a9d4088217e5b66be2be38c66be8695d2510b2f6fb4694c0f"} Mar 17 11:31:58 crc kubenswrapper[4742]: I0317 11:31:58.936784 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa34f48d0c855e0a9d4088217e5b66be2be38c66be8695d2510b2f6fb4694c0f" Mar 17 11:31:58 crc kubenswrapper[4742]: I0317 11:31:58.940717 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be22c821-2e25-47ed-938d-c748fc55a4c6","Type":"ContainerStarted","Data":"e68b1f6b7e100e7705aa37c427706e5914644a3fbb860d34c945f7395db615f9"} Mar 17 11:31:58 crc kubenswrapper[4742]: I0317 11:31:58.940743 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be22c821-2e25-47ed-938d-c748fc55a4c6","Type":"ContainerStarted","Data":"49d0ec08dc0653c2b4aee8f4c1faadaf1bdba9c56fc466213412b9fc202cad39"} Mar 17 11:31:58 crc kubenswrapper[4742]: I0317 11:31:58.943395 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6","Type":"ContainerStarted","Data":"f8811158aa410033c4850052e5f64091ae9d78c2cd5b4b4285c898d9d4837c55"} Mar 17 11:31:58 crc kubenswrapper[4742]: I0317 11:31:58.943721 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 17 11:31:58 crc kubenswrapper[4742]: I0317 11:31:58.945360 4742 generic.go:334] "Generic (PLEG): container finished" podID="0d71d306-a987-411e-82fe-e18450aa18a2" containerID="ae2be08fc5ec8464794b9d028f78ef7f5e9da6e8e3861cfa52e24654763af4af" exitCode=0 Mar 17 11:31:58 crc kubenswrapper[4742]: I0317 11:31:58.945465 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d71d306-a987-411e-82fe-e18450aa18a2","Type":"ContainerDied","Data":"ae2be08fc5ec8464794b9d028f78ef7f5e9da6e8e3861cfa52e24654763af4af"} Mar 17 11:31:58 crc kubenswrapper[4742]: I0317 11:31:58.978393 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=41.422372739 podStartE2EDuration="1m11.978374606s" podCreationTimestamp="2026-03-17 11:30:47 +0000 UTC" firstStartedPulling="2026-03-17 11:30:53.57032566 +0000 UTC m=+1156.696453418" lastFinishedPulling="2026-03-17 11:31:24.126327527 +0000 UTC m=+1187.252455285" observedRunningTime="2026-03-17 11:31:58.973544018 +0000 UTC m=+1222.099671796" watchObservedRunningTime="2026-03-17 11:31:58.978374606 +0000 UTC m=+1222.104502364" Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.358081 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.491826 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ab12177c-bfc5-40c8-9468-81096ef2ac5e-scripts\") pod \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.492202 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-log-ovn\") pod \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.492263 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-run\") pod \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.492352 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ab12177c-bfc5-40c8-9468-81096ef2ac5e" (UID: "ab12177c-bfc5-40c8-9468-81096ef2ac5e"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.492406 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-run" (OuterVolumeSpecName: "var-run") pod "ab12177c-bfc5-40c8-9468-81096ef2ac5e" (UID: "ab12177c-bfc5-40c8-9468-81096ef2ac5e"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.492476 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtkmx\" (UniqueName: \"kubernetes.io/projected/ab12177c-bfc5-40c8-9468-81096ef2ac5e-kube-api-access-wtkmx\") pod \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.492498 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-run-ovn\") pod \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.492582 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ab12177c-bfc5-40c8-9468-81096ef2ac5e" (UID: "ab12177c-bfc5-40c8-9468-81096ef2ac5e"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.492947 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ab12177c-bfc5-40c8-9468-81096ef2ac5e-additional-scripts\") pod \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\" (UID: \"ab12177c-bfc5-40c8-9468-81096ef2ac5e\") " Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.493078 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab12177c-bfc5-40c8-9468-81096ef2ac5e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ab12177c-bfc5-40c8-9468-81096ef2ac5e" (UID: "ab12177c-bfc5-40c8-9468-81096ef2ac5e"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.493163 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab12177c-bfc5-40c8-9468-81096ef2ac5e-scripts" (OuterVolumeSpecName: "scripts") pod "ab12177c-bfc5-40c8-9468-81096ef2ac5e" (UID: "ab12177c-bfc5-40c8-9468-81096ef2ac5e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.493349 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ab12177c-bfc5-40c8-9468-81096ef2ac5e-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.493361 4742 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.493370 4742 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-run\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.493378 4742 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ab12177c-bfc5-40c8-9468-81096ef2ac5e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.493386 4742 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ab12177c-bfc5-40c8-9468-81096ef2ac5e-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.496115 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab12177c-bfc5-40c8-9468-81096ef2ac5e-kube-api-access-wtkmx" (OuterVolumeSpecName: "kube-api-access-wtkmx") pod "ab12177c-bfc5-40c8-9468-81096ef2ac5e" (UID: "ab12177c-bfc5-40c8-9468-81096ef2ac5e"). InnerVolumeSpecName "kube-api-access-wtkmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.595199 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtkmx\" (UniqueName: \"kubernetes.io/projected/ab12177c-bfc5-40c8-9468-81096ef2ac5e-kube-api-access-wtkmx\") on node \"crc\" DevicePath \"\"" Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.956297 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4j5jz-config-vzbgt" event={"ID":"ab12177c-bfc5-40c8-9468-81096ef2ac5e","Type":"ContainerDied","Data":"972f82ef12540da5ad6341d69a4e02541676fce086fafa9691bbdcc30157084c"} Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.956993 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="972f82ef12540da5ad6341d69a4e02541676fce086fafa9691bbdcc30157084c" Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.956316 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4j5jz-config-vzbgt" Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.960971 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be22c821-2e25-47ed-938d-c748fc55a4c6","Type":"ContainerStarted","Data":"7699e7c165923b7091f3194ef5f9ecfe8b7b42f5bb66e60a03e44c02f47cd8f0"} Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.961016 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be22c821-2e25-47ed-938d-c748fc55a4c6","Type":"ContainerStarted","Data":"1b452153e295e9a087737bc8f4ed561655367054aada0fc43204ee8502c5ae1d"} Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.961027 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be22c821-2e25-47ed-938d-c748fc55a4c6","Type":"ContainerStarted","Data":"00efd4f5b1cdf2cc12b0721cbc948f86668c40c1a9a1fa919a4a1a966a25542e"} Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.961036 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be22c821-2e25-47ed-938d-c748fc55a4c6","Type":"ContainerStarted","Data":"dab736d45529405a31061e000ee455abfcc522a092584131903d192af838e9a3"} Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.964078 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d71d306-a987-411e-82fe-e18450aa18a2","Type":"ContainerStarted","Data":"0f7789cc70ff5ae1940a1e73e599735fcfd8df82cb6befebbe23b70ff21d4d9a"} Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.964247 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:31:59 crc kubenswrapper[4742]: I0317 11:31:59.990037 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371963.86476 podStartE2EDuration="1m12.990016587s" podCreationTimestamp="2026-03-17 11:30:47 +0000 UTC" firstStartedPulling="2026-03-17 11:30:53.570737481 +0000 UTC m=+1156.696865239" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:31:59.987376997 +0000 UTC m=+1223.113504755" watchObservedRunningTime="2026-03-17 11:31:59.990016587 +0000 UTC m=+1223.116144355" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.149034 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562452-vfzcm"] Mar 17 11:32:00 crc 
kubenswrapper[4742]: E0317 11:32:00.149456 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab12177c-bfc5-40c8-9468-81096ef2ac5e" containerName="ovn-config" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.149478 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab12177c-bfc5-40c8-9468-81096ef2ac5e" containerName="ovn-config" Mar 17 11:32:00 crc kubenswrapper[4742]: E0317 11:32:00.149492 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e711c48-d9a0-4fd4-8d8d-734cb315d3e8" containerName="mariadb-account-create-update" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.149500 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e711c48-d9a0-4fd4-8d8d-734cb315d3e8" containerName="mariadb-account-create-update" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.149686 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab12177c-bfc5-40c8-9468-81096ef2ac5e" containerName="ovn-config" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.149709 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e711c48-d9a0-4fd4-8d8d-734cb315d3e8" containerName="mariadb-account-create-update" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.150407 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562452-vfzcm" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.154465 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.154703 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.160155 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562452-vfzcm"] Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.189665 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.204024 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6m69\" (UniqueName: \"kubernetes.io/projected/86f1bf7f-425d-46dc-945b-64afbc107101-kube-api-access-f6m69\") pod \"auto-csr-approver-29562452-vfzcm\" (UID: \"86f1bf7f-425d-46dc-945b-64afbc107101\") " pod="openshift-infra/auto-csr-approver-29562452-vfzcm" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.305203 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6m69\" (UniqueName: \"kubernetes.io/projected/86f1bf7f-425d-46dc-945b-64afbc107101-kube-api-access-f6m69\") pod \"auto-csr-approver-29562452-vfzcm\" (UID: \"86f1bf7f-425d-46dc-945b-64afbc107101\") " pod="openshift-infra/auto-csr-approver-29562452-vfzcm" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.307006 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jt6xb"] Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.313509 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jt6xb"] Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.327431 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6m69\" (UniqueName: 
\"kubernetes.io/projected/86f1bf7f-425d-46dc-945b-64afbc107101-kube-api-access-f6m69\") pod \"auto-csr-approver-29562452-vfzcm\" (UID: \"86f1bf7f-425d-46dc-945b-64afbc107101\") " pod="openshift-infra/auto-csr-approver-29562452-vfzcm" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.451756 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-4j5jz-config-vzbgt"] Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.457807 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-4j5jz-config-vzbgt"] Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.500032 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562452-vfzcm" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.586718 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-4j5jz-config-rcrvr"] Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.587577 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.589427 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.600516 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4j5jz-config-rcrvr"] Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.609598 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq4fl\" (UniqueName: \"kubernetes.io/projected/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-kube-api-access-fq4fl\") pod \"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.609650 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-run-ovn\") pod \"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.609678 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-run\") pod \"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.609731 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-log-ovn\") pod \"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.609759 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-additional-scripts\") pod \"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 
17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.609806 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-scripts\") pod \"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.679705 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e711c48-d9a0-4fd4-8d8d-734cb315d3e8" path="/var/lib/kubelet/pods/7e711c48-d9a0-4fd4-8d8d-734cb315d3e8/volumes" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.680482 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab12177c-bfc5-40c8-9468-81096ef2ac5e" path="/var/lib/kubelet/pods/ab12177c-bfc5-40c8-9468-81096ef2ac5e/volumes" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.711822 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-run-ovn\") pod \"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.711881 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-run\") pod \"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.711959 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-log-ovn\") pod \"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.712004 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-additional-scripts\") pod \"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.712054 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-scripts\") pod \"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.712094 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq4fl\" (UniqueName: \"kubernetes.io/projected/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-kube-api-access-fq4fl\") pod \"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.713577 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-run-ovn\") pod 
\"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.713631 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-run\") pod \"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.713815 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-log-ovn\") pod \"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.714397 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-additional-scripts\") pod \"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.717331 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-scripts\") pod \"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.738971 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq4fl\" (UniqueName: \"kubernetes.io/projected/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-kube-api-access-fq4fl\") pod \"ovn-controller-4j5jz-config-rcrvr\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:00 crc kubenswrapper[4742]: I0317 11:32:00.902304 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:01 crc kubenswrapper[4742]: I0317 11:32:01.017174 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562452-vfzcm"] Mar 17 11:32:01 crc kubenswrapper[4742]: W0317 11:32:01.407896 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9078fef3_bc6f_4aaa_b1f3_634b9cde7da2.slice/crio-6689337475efbfab3b0e8da81a1d88fdcade53a64e4ddacebf9a65c7a1240a7f WatchSource:0}: Error finding container 6689337475efbfab3b0e8da81a1d88fdcade53a64e4ddacebf9a65c7a1240a7f: Status 404 returned error can't find the container with id 6689337475efbfab3b0e8da81a1d88fdcade53a64e4ddacebf9a65c7a1240a7f Mar 17 11:32:01 crc kubenswrapper[4742]: I0317 11:32:01.409570 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4j5jz-config-rcrvr"] Mar 17 11:32:02 crc kubenswrapper[4742]: I0317 11:32:02.008460 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562452-vfzcm" event={"ID":"86f1bf7f-425d-46dc-945b-64afbc107101","Type":"ContainerStarted","Data":"dba62a68fa1791ef695eef7620a377d9c09591f9a82dc62e6dc06e04fb98f16f"} Mar 17 11:32:02 crc kubenswrapper[4742]: I0317 11:32:02.009993 4742 generic.go:334] "Generic (PLEG): container finished" podID="9078fef3-bc6f-4aaa-b1f3-634b9cde7da2" containerID="8eb2067fd4abccdf70a7351181706842d5af7477c7771cd486a8c0c2d41da946" exitCode=0 Mar 17 11:32:02 crc kubenswrapper[4742]: I0317 11:32:02.010018 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4j5jz-config-rcrvr" event={"ID":"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2","Type":"ContainerDied","Data":"8eb2067fd4abccdf70a7351181706842d5af7477c7771cd486a8c0c2d41da946"} Mar 17 11:32:02 crc kubenswrapper[4742]: I0317 11:32:02.010033 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4j5jz-config-rcrvr" event={"ID":"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2","Type":"ContainerStarted","Data":"6689337475efbfab3b0e8da81a1d88fdcade53a64e4ddacebf9a65c7a1240a7f"} Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.020408 4742 generic.go:334] "Generic (PLEG): container finished" podID="86f1bf7f-425d-46dc-945b-64afbc107101" containerID="d1650d3cc9b26e02486db88fcd53040074b628ab99a1931281a1204591ad8624" exitCode=0 Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.020476 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562452-vfzcm" event={"ID":"86f1bf7f-425d-46dc-945b-64afbc107101","Type":"ContainerDied","Data":"d1650d3cc9b26e02486db88fcd53040074b628ab99a1931281a1204591ad8624"} Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.029847 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be22c821-2e25-47ed-938d-c748fc55a4c6","Type":"ContainerStarted","Data":"3ca2b834a294d548216b2038dcf23d4797e75cd8f23dccb81edf6ac2a247fc26"} Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.029877 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be22c821-2e25-47ed-938d-c748fc55a4c6","Type":"ContainerStarted","Data":"3978659c20e4c3c2763d82b71a4a65557d44f5bac06f056246b9b0187111bfd7"} Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.029887 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"be22c821-2e25-47ed-938d-c748fc55a4c6","Type":"ContainerStarted","Data":"363df1097736a9ebd5acfc58715ca54d2a81cf95ff727839de2755aff8f91f66"} Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.029896 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be22c821-2e25-47ed-938d-c748fc55a4c6","Type":"ContainerStarted","Data":"7273f83140a0ae418505e6ebd168fc89768c8e8acb8697bc23236733e1a6663d"} Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.029921 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be22c821-2e25-47ed-938d-c748fc55a4c6","Type":"ContainerStarted","Data":"8303644833f48e5deac8a17f50207dab7355f090b1c31eb2ada1b73f6260fe4d"} Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.279517 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.376669 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-run-ovn\") pod \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.376749 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-scripts\") pod \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.376804 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-log-ovn\") pod \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.376805 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "9078fef3-bc6f-4aaa-b1f3-634b9cde7da2" (UID: "9078fef3-bc6f-4aaa-b1f3-634b9cde7da2"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.376859 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-additional-scripts\") pod \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.376984 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq4fl\" (UniqueName: \"kubernetes.io/projected/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-kube-api-access-fq4fl\") pod \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.377034 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-run\") pod \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\" (UID: \"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2\") " Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.377319 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9078fef3-bc6f-4aaa-b1f3-634b9cde7da2" (UID: "9078fef3-bc6f-4aaa-b1f3-634b9cde7da2"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.377362 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-run" (OuterVolumeSpecName: "var-run") pod "9078fef3-bc6f-4aaa-b1f3-634b9cde7da2" (UID: "9078fef3-bc6f-4aaa-b1f3-634b9cde7da2"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.377505 4742 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-run\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.377519 4742 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.377530 4742 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.377871 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "9078fef3-bc6f-4aaa-b1f3-634b9cde7da2" (UID: "9078fef3-bc6f-4aaa-b1f3-634b9cde7da2"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.378029 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-scripts" (OuterVolumeSpecName: "scripts") pod "9078fef3-bc6f-4aaa-b1f3-634b9cde7da2" (UID: "9078fef3-bc6f-4aaa-b1f3-634b9cde7da2"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.381219 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-kube-api-access-fq4fl" (OuterVolumeSpecName: "kube-api-access-fq4fl") pod "9078fef3-bc6f-4aaa-b1f3-634b9cde7da2" (UID: "9078fef3-bc6f-4aaa-b1f3-634b9cde7da2"). InnerVolumeSpecName "kube-api-access-fq4fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.478940 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq4fl\" (UniqueName: \"kubernetes.io/projected/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-kube-api-access-fq4fl\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.478975 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:03 crc kubenswrapper[4742]: I0317 11:32:03.478988 4742 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.040826 4742 generic.go:334] "Generic (PLEG): container finished" podID="828f87ee-72a4-43e5-88f7-0a15975b90a5" containerID="2ba37516202552ef925c69fed60b69b58405874b3021fb7d38fb547f33098c55" exitCode=0 Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.040938 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6sr2t" event={"ID":"828f87ee-72a4-43e5-88f7-0a15975b90a5","Type":"ContainerDied","Data":"2ba37516202552ef925c69fed60b69b58405874b3021fb7d38fb547f33098c55"} Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.043202 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4j5jz-config-rcrvr" event={"ID":"9078fef3-bc6f-4aaa-b1f3-634b9cde7da2","Type":"ContainerDied","Data":"6689337475efbfab3b0e8da81a1d88fdcade53a64e4ddacebf9a65c7a1240a7f"} Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.043247 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6689337475efbfab3b0e8da81a1d88fdcade53a64e4ddacebf9a65c7a1240a7f" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.043313 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-4j5jz-config-rcrvr" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.053857 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be22c821-2e25-47ed-938d-c748fc55a4c6","Type":"ContainerStarted","Data":"7202c5d103ba3c2a78c44c69f2ac311acf7ea0ade9c7998a9e208aa13a1def62"} Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.053902 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"be22c821-2e25-47ed-938d-c748fc55a4c6","Type":"ContainerStarted","Data":"38f5c656979ac4390c9ff9d90e79963f2aa01dd21ed81fb4f29b466b460b510a"} Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.080459 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-7kc77"] Mar 17 11:32:04 crc kubenswrapper[4742]: E0317 11:32:04.080848 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9078fef3-bc6f-4aaa-b1f3-634b9cde7da2" containerName="ovn-config" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.080865 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="9078fef3-bc6f-4aaa-b1f3-634b9cde7da2" containerName="ovn-config" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.081101 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="9078fef3-bc6f-4aaa-b1f3-634b9cde7da2" containerName="ovn-config" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.081688 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7kc77" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.084336 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.092690 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7kc77"] Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.126572 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=24.643451227 podStartE2EDuration="30.126554326s" podCreationTimestamp="2026-03-17 11:31:34 +0000 UTC" firstStartedPulling="2026-03-17 11:31:56.362875217 +0000 UTC m=+1219.489002975" lastFinishedPulling="2026-03-17 11:32:01.845978306 +0000 UTC m=+1224.972106074" observedRunningTime="2026-03-17 11:32:04.124492812 +0000 UTC m=+1227.250620570" watchObservedRunningTime="2026-03-17 11:32:04.126554326 +0000 UTC m=+1227.252682094" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.191375 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cfb2b46-5e43-4332-901d-85ca6fbc2ad8-operator-scripts\") pod \"root-account-create-update-7kc77\" (UID: \"4cfb2b46-5e43-4332-901d-85ca6fbc2ad8\") " pod="openstack/root-account-create-update-7kc77" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.191771 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnm7p\" (UniqueName: \"kubernetes.io/projected/4cfb2b46-5e43-4332-901d-85ca6fbc2ad8-kube-api-access-fnm7p\") pod \"root-account-create-update-7kc77\" (UID: \"4cfb2b46-5e43-4332-901d-85ca6fbc2ad8\") " pod="openstack/root-account-create-update-7kc77" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.293434 4742 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-fnm7p\" (UniqueName: \"kubernetes.io/projected/4cfb2b46-5e43-4332-901d-85ca6fbc2ad8-kube-api-access-fnm7p\") pod \"root-account-create-update-7kc77\" (UID: \"4cfb2b46-5e43-4332-901d-85ca6fbc2ad8\") " pod="openstack/root-account-create-update-7kc77" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.293520 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cfb2b46-5e43-4332-901d-85ca6fbc2ad8-operator-scripts\") pod \"root-account-create-update-7kc77\" (UID: \"4cfb2b46-5e43-4332-901d-85ca6fbc2ad8\") " pod="openstack/root-account-create-update-7kc77" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.294686 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cfb2b46-5e43-4332-901d-85ca6fbc2ad8-operator-scripts\") pod \"root-account-create-update-7kc77\" (UID: \"4cfb2b46-5e43-4332-901d-85ca6fbc2ad8\") " pod="openstack/root-account-create-update-7kc77" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.327717 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnm7p\" (UniqueName: \"kubernetes.io/projected/4cfb2b46-5e43-4332-901d-85ca6fbc2ad8-kube-api-access-fnm7p\") pod \"root-account-create-update-7kc77\" (UID: \"4cfb2b46-5e43-4332-901d-85ca6fbc2ad8\") " pod="openstack/root-account-create-update-7kc77" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.362248 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-4j5jz-config-rcrvr"] Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.373879 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-4j5jz-config-rcrvr"] Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.414263 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562452-vfzcm" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.416849 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7kc77" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.473368 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-wdvns"] Mar 17 11:32:04 crc kubenswrapper[4742]: E0317 11:32:04.473715 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f1bf7f-425d-46dc-945b-64afbc107101" containerName="oc" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.473732 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f1bf7f-425d-46dc-945b-64afbc107101" containerName="oc" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.474656 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f1bf7f-425d-46dc-945b-64afbc107101" containerName="oc" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.477147 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.481782 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.503495 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6m69\" (UniqueName: \"kubernetes.io/projected/86f1bf7f-425d-46dc-945b-64afbc107101-kube-api-access-f6m69\") pod \"86f1bf7f-425d-46dc-945b-64afbc107101\" (UID: \"86f1bf7f-425d-46dc-945b-64afbc107101\") " Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.504687 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rws4j\" (UniqueName: \"kubernetes.io/projected/f4c5f514-3700-48be-bc71-77a939cb171e-kube-api-access-rws4j\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.506621 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-config\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.510679 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.510825 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.509043 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-wdvns"] Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.512972 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.513121 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-dns-svc\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.517053 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86f1bf7f-425d-46dc-945b-64afbc107101-kube-api-access-f6m69" (OuterVolumeSpecName: "kube-api-access-f6m69") pod "86f1bf7f-425d-46dc-945b-64afbc107101" (UID: 
"86f1bf7f-425d-46dc-945b-64afbc107101"). InnerVolumeSpecName "kube-api-access-f6m69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.615021 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.615297 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.615342 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.615387 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-dns-svc\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.615494 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rws4j\" (UniqueName: \"kubernetes.io/projected/f4c5f514-3700-48be-bc71-77a939cb171e-kube-api-access-rws4j\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.615524 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-config\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.615583 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6m69\" (UniqueName: \"kubernetes.io/projected/86f1bf7f-425d-46dc-945b-64afbc107101-kube-api-access-f6m69\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.615889 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.616327 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc 
kubenswrapper[4742]: I0317 11:32:04.616441 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-config\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.616533 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-dns-svc\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.616740 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.633298 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rws4j\" (UniqueName: \"kubernetes.io/projected/f4c5f514-3700-48be-bc71-77a939cb171e-kube-api-access-rws4j\") pod \"dnsmasq-dns-764c5664d7-wdvns\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") " pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.673895 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9078fef3-bc6f-4aaa-b1f3-634b9cde7da2" path="/var/lib/kubelet/pods/9078fef3-bc6f-4aaa-b1f3-634b9cde7da2/volumes" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.840759 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:04 crc kubenswrapper[4742]: I0317 11:32:04.913927 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7kc77"] Mar 17 11:32:04 crc kubenswrapper[4742]: W0317 11:32:04.920501 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cfb2b46_5e43_4332_901d_85ca6fbc2ad8.slice/crio-cd53a591f1a4288a8c7b991fa61916ab398d52b6c307891952d5b5383b315666 WatchSource:0}: Error finding container cd53a591f1a4288a8c7b991fa61916ab398d52b6c307891952d5b5383b315666: Status 404 returned error can't find the container with id cd53a591f1a4288a8c7b991fa61916ab398d52b6c307891952d5b5383b315666 Mar 17 11:32:05 crc kubenswrapper[4742]: I0317 11:32:05.063661 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7kc77" event={"ID":"4cfb2b46-5e43-4332-901d-85ca6fbc2ad8","Type":"ContainerStarted","Data":"cd53a591f1a4288a8c7b991fa61916ab398d52b6c307891952d5b5383b315666"} Mar 17 11:32:05 crc kubenswrapper[4742]: I0317 11:32:05.065569 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562452-vfzcm" event={"ID":"86f1bf7f-425d-46dc-945b-64afbc107101","Type":"ContainerDied","Data":"dba62a68fa1791ef695eef7620a377d9c09591f9a82dc62e6dc06e04fb98f16f"} Mar 17 11:32:05 crc kubenswrapper[4742]: I0317 11:32:05.065607 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dba62a68fa1791ef695eef7620a377d9c09591f9a82dc62e6dc06e04fb98f16f" Mar 17 11:32:05 crc kubenswrapper[4742]: I0317 11:32:05.065662 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562452-vfzcm" Mar 17 11:32:05 crc kubenswrapper[4742]: I0317 11:32:05.305868 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-wdvns"] Mar 17 11:32:05 crc kubenswrapper[4742]: W0317 11:32:05.310351 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4c5f514_3700_48be_bc71_77a939cb171e.slice/crio-ca2347ec0115e06573dc3ada26aa0937d075479bd71a1c014e0eb82ef7b9bb97 WatchSource:0}: Error finding container ca2347ec0115e06573dc3ada26aa0937d075479bd71a1c014e0eb82ef7b9bb97: Status 404 returned error can't find the container with id ca2347ec0115e06573dc3ada26aa0937d075479bd71a1c014e0eb82ef7b9bb97 Mar 17 11:32:05 crc kubenswrapper[4742]: I0317 11:32:05.489066 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562446-67mbt"] Mar 17 11:32:05 crc kubenswrapper[4742]: I0317 11:32:05.497756 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562446-67mbt"] Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.056412 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6sr2t" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.075170 4742 generic.go:334] "Generic (PLEG): container finished" podID="f4c5f514-3700-48be-bc71-77a939cb171e" containerID="92d0f99375060c4cdb285c1c59463e9edd7cbcfb69ef483440d13c42d590a10e" exitCode=0 Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.075229 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-wdvns" event={"ID":"f4c5f514-3700-48be-bc71-77a939cb171e","Type":"ContainerDied","Data":"92d0f99375060c4cdb285c1c59463e9edd7cbcfb69ef483440d13c42d590a10e"} Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.075252 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-wdvns" event={"ID":"f4c5f514-3700-48be-bc71-77a939cb171e","Type":"ContainerStarted","Data":"ca2347ec0115e06573dc3ada26aa0937d075479bd71a1c014e0eb82ef7b9bb97"} Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.077839 4742 generic.go:334] "Generic (PLEG): container finished" podID="4cfb2b46-5e43-4332-901d-85ca6fbc2ad8" containerID="eb0dc37886b13f72e39bed5c9ab7ebbedd6f9e9cbd96d9aff5c8c8e4cf61f6c8" exitCode=0 Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.077869 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7kc77" event={"ID":"4cfb2b46-5e43-4332-901d-85ca6fbc2ad8","Type":"ContainerDied","Data":"eb0dc37886b13f72e39bed5c9ab7ebbedd6f9e9cbd96d9aff5c8c8e4cf61f6c8"} Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.079705 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6sr2t" event={"ID":"828f87ee-72a4-43e5-88f7-0a15975b90a5","Type":"ContainerDied","Data":"42e7e7f6ead65811bd03dfa93f7c9130753d82807fe03b45caa667f47a6dc3c1"} Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.079731 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42e7e7f6ead65811bd03dfa93f7c9130753d82807fe03b45caa667f47a6dc3c1" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.079778 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6sr2t" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.142609 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-db-sync-config-data\") pod \"828f87ee-72a4-43e5-88f7-0a15975b90a5\" (UID: \"828f87ee-72a4-43e5-88f7-0a15975b90a5\") " Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.142642 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-combined-ca-bundle\") pod \"828f87ee-72a4-43e5-88f7-0a15975b90a5\" (UID: \"828f87ee-72a4-43e5-88f7-0a15975b90a5\") " Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.142665 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltvl6\" (UniqueName: \"kubernetes.io/projected/828f87ee-72a4-43e5-88f7-0a15975b90a5-kube-api-access-ltvl6\") pod \"828f87ee-72a4-43e5-88f7-0a15975b90a5\" (UID: \"828f87ee-72a4-43e5-88f7-0a15975b90a5\") " Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.142731 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-config-data\") pod \"828f87ee-72a4-43e5-88f7-0a15975b90a5\" (UID: \"828f87ee-72a4-43e5-88f7-0a15975b90a5\") " Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.150520 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/828f87ee-72a4-43e5-88f7-0a15975b90a5-kube-api-access-ltvl6" (OuterVolumeSpecName: "kube-api-access-ltvl6") pod "828f87ee-72a4-43e5-88f7-0a15975b90a5" (UID: "828f87ee-72a4-43e5-88f7-0a15975b90a5"). InnerVolumeSpecName "kube-api-access-ltvl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.155340 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "828f87ee-72a4-43e5-88f7-0a15975b90a5" (UID: "828f87ee-72a4-43e5-88f7-0a15975b90a5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.193387 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "828f87ee-72a4-43e5-88f7-0a15975b90a5" (UID: "828f87ee-72a4-43e5-88f7-0a15975b90a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.210437 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-config-data" (OuterVolumeSpecName: "config-data") pod "828f87ee-72a4-43e5-88f7-0a15975b90a5" (UID: "828f87ee-72a4-43e5-88f7-0a15975b90a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.245391 4742 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.245425 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.245439 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltvl6\" (UniqueName: \"kubernetes.io/projected/828f87ee-72a4-43e5-88f7-0a15975b90a5-kube-api-access-ltvl6\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.245452 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828f87ee-72a4-43e5-88f7-0a15975b90a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.456655 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-wdvns"] Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.476240 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-w62xn"] Mar 17 11:32:06 crc kubenswrapper[4742]: E0317 11:32:06.476578 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="828f87ee-72a4-43e5-88f7-0a15975b90a5" containerName="glance-db-sync" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.476599 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="828f87ee-72a4-43e5-88f7-0a15975b90a5" containerName="glance-db-sync" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.476779 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="828f87ee-72a4-43e5-88f7-0a15975b90a5" containerName="glance-db-sync" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.478192 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.500305 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-w62xn"] Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.556894 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.557071 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.557250 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-config\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.557281 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.557339 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.557362 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7pfm\" (UniqueName: \"kubernetes.io/projected/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-kube-api-access-f7pfm\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.658741 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-config\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.658786 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.658824 4742 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.658866 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7pfm\" (UniqueName: \"kubernetes.io/projected/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-kube-api-access-f7pfm\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.658952 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.658982 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.659743 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-config\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.659783 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.660026 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.660126 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.660554 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.671966 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea540b21-cd5f-485f-863c-24bc91beab7e" 
path="/var/lib/kubelet/pods/ea540b21-cd5f-485f-863c-24bc91beab7e/volumes" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.677800 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7pfm\" (UniqueName: \"kubernetes.io/projected/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-kube-api-access-f7pfm\") pod \"dnsmasq-dns-74f6bcbc87-w62xn\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:06 crc kubenswrapper[4742]: I0317 11:32:06.795338 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:08 crc kubenswrapper[4742]: I0317 11:32:08.940482 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7kc77" Mar 17 11:32:08 crc kubenswrapper[4742]: I0317 11:32:08.996881 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnm7p\" (UniqueName: \"kubernetes.io/projected/4cfb2b46-5e43-4332-901d-85ca6fbc2ad8-kube-api-access-fnm7p\") pod \"4cfb2b46-5e43-4332-901d-85ca6fbc2ad8\" (UID: \"4cfb2b46-5e43-4332-901d-85ca6fbc2ad8\") " Mar 17 11:32:08 crc kubenswrapper[4742]: I0317 11:32:08.997065 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cfb2b46-5e43-4332-901d-85ca6fbc2ad8-operator-scripts\") pod \"4cfb2b46-5e43-4332-901d-85ca6fbc2ad8\" (UID: \"4cfb2b46-5e43-4332-901d-85ca6fbc2ad8\") " Mar 17 11:32:08 crc kubenswrapper[4742]: I0317 11:32:08.998439 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cfb2b46-5e43-4332-901d-85ca6fbc2ad8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4cfb2b46-5e43-4332-901d-85ca6fbc2ad8" (UID: "4cfb2b46-5e43-4332-901d-85ca6fbc2ad8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.007953 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cfb2b46-5e43-4332-901d-85ca6fbc2ad8-kube-api-access-fnm7p" (OuterVolumeSpecName: "kube-api-access-fnm7p") pod "4cfb2b46-5e43-4332-901d-85ca6fbc2ad8" (UID: "4cfb2b46-5e43-4332-901d-85ca6fbc2ad8"). InnerVolumeSpecName "kube-api-access-fnm7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.099663 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cfb2b46-5e43-4332-901d-85ca6fbc2ad8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.099689 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnm7p\" (UniqueName: \"kubernetes.io/projected/4cfb2b46-5e43-4332-901d-85ca6fbc2ad8-kube-api-access-fnm7p\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.102843 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7kc77" event={"ID":"4cfb2b46-5e43-4332-901d-85ca6fbc2ad8","Type":"ContainerDied","Data":"cd53a591f1a4288a8c7b991fa61916ab398d52b6c307891952d5b5383b315666"} Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.102881 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd53a591f1a4288a8c7b991fa61916ab398d52b6c307891952d5b5383b315666" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.102957 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-7kc77" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.283151 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.365403 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-w62xn"] Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.619657 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-dps6c"] Mar 17 11:32:09 crc kubenswrapper[4742]: E0317 11:32:09.620233 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cfb2b46-5e43-4332-901d-85ca6fbc2ad8" containerName="mariadb-account-create-update" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.620246 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cfb2b46-5e43-4332-901d-85ca6fbc2ad8" containerName="mariadb-account-create-update" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.620396 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cfb2b46-5e43-4332-901d-85ca6fbc2ad8" containerName="mariadb-account-create-update" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.620882 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-dps6c" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.637682 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-dps6c"] Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.713634 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-5f9b-account-create-update-7kbcm"] Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.714011 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-674qz\" (UniqueName: \"kubernetes.io/projected/1e82963e-fa88-4b3c-847c-4fc0976e63b0-kube-api-access-674qz\") pod \"cinder-db-create-dps6c\" (UID: \"1e82963e-fa88-4b3c-847c-4fc0976e63b0\") " pod="openstack/cinder-db-create-dps6c" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.714161 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e82963e-fa88-4b3c-847c-4fc0976e63b0-operator-scripts\") pod \"cinder-db-create-dps6c\" (UID: \"1e82963e-fa88-4b3c-847c-4fc0976e63b0\") " pod="openstack/cinder-db-create-dps6c" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.714656 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5f9b-account-create-update-7kbcm" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.718121 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.722425 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5f9b-account-create-update-7kbcm"] Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.817220 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e82963e-fa88-4b3c-847c-4fc0976e63b0-operator-scripts\") pod \"cinder-db-create-dps6c\" (UID: \"1e82963e-fa88-4b3c-847c-4fc0976e63b0\") " pod="openstack/cinder-db-create-dps6c" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.817305 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45ef3446-d4b4-42dd-862f-0e4c548a9752-operator-scripts\") pod \"cinder-5f9b-account-create-update-7kbcm\" (UID: \"45ef3446-d4b4-42dd-862f-0e4c548a9752\") " pod="openstack/cinder-5f9b-account-create-update-7kbcm" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.817364 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-674qz\" (UniqueName: \"kubernetes.io/projected/1e82963e-fa88-4b3c-847c-4fc0976e63b0-kube-api-access-674qz\") pod \"cinder-db-create-dps6c\" (UID: \"1e82963e-fa88-4b3c-847c-4fc0976e63b0\") " pod="openstack/cinder-db-create-dps6c" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.817393 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g552g\" (UniqueName: \"kubernetes.io/projected/45ef3446-d4b4-42dd-862f-0e4c548a9752-kube-api-access-g552g\") pod \"cinder-5f9b-account-create-update-7kbcm\" (UID: \"45ef3446-d4b4-42dd-862f-0e4c548a9752\") " pod="openstack/cinder-5f9b-account-create-update-7kbcm" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.818144 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1e82963e-fa88-4b3c-847c-4fc0976e63b0-operator-scripts\") pod \"cinder-db-create-dps6c\" (UID: \"1e82963e-fa88-4b3c-847c-4fc0976e63b0\") " pod="openstack/cinder-db-create-dps6c" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.823890 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-dhlk7"] Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.826760 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dhlk7" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.860479 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-674qz\" (UniqueName: \"kubernetes.io/projected/1e82963e-fa88-4b3c-847c-4fc0976e63b0-kube-api-access-674qz\") pod \"cinder-db-create-dps6c\" (UID: \"1e82963e-fa88-4b3c-847c-4fc0976e63b0\") " pod="openstack/cinder-db-create-dps6c" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.885418 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dhlk7"] Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.920002 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g552g\" (UniqueName: \"kubernetes.io/projected/45ef3446-d4b4-42dd-862f-0e4c548a9752-kube-api-access-g552g\") pod \"cinder-5f9b-account-create-update-7kbcm\" (UID: \"45ef3446-d4b4-42dd-862f-0e4c548a9752\") " pod="openstack/cinder-5f9b-account-create-update-7kbcm" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.920103 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d11a1d54-dde6-466c-a500-72fd1c349db3-operator-scripts\") pod \"barbican-db-create-dhlk7\" (UID: \"d11a1d54-dde6-466c-a500-72fd1c349db3\") " pod="openstack/barbican-db-create-dhlk7" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.920161 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scq5v\" (UniqueName: \"kubernetes.io/projected/d11a1d54-dde6-466c-a500-72fd1c349db3-kube-api-access-scq5v\") pod \"barbican-db-create-dhlk7\" (UID: \"d11a1d54-dde6-466c-a500-72fd1c349db3\") " pod="openstack/barbican-db-create-dhlk7" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.920246 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45ef3446-d4b4-42dd-862f-0e4c548a9752-operator-scripts\") pod \"cinder-5f9b-account-create-update-7kbcm\" (UID: \"45ef3446-d4b4-42dd-862f-0e4c548a9752\") " pod="openstack/cinder-5f9b-account-create-update-7kbcm" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.921167 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45ef3446-d4b4-42dd-862f-0e4c548a9752-operator-scripts\") pod \"cinder-5f9b-account-create-update-7kbcm\" (UID: \"45ef3446-d4b4-42dd-862f-0e4c548a9752\") " pod="openstack/cinder-5f9b-account-create-update-7kbcm" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.926277 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-9srm9"] Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.927270 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-9srm9" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.930846 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.931864 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fjbw9" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.932120 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.932185 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.933636 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9srm9"] Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.935696 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dps6c" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.949334 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g552g\" (UniqueName: \"kubernetes.io/projected/45ef3446-d4b4-42dd-862f-0e4c548a9752-kube-api-access-g552g\") pod \"cinder-5f9b-account-create-update-7kbcm\" (UID: \"45ef3446-d4b4-42dd-862f-0e4c548a9752\") " pod="openstack/cinder-5f9b-account-create-update-7kbcm" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.951574 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f965-account-create-update-mfv8t"] Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.953307 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f965-account-create-update-mfv8t" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.955129 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 17 11:32:09 crc kubenswrapper[4742]: I0317 11:32:09.968845 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f965-account-create-update-mfv8t"] Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.023597 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcxvg\" (UniqueName: \"kubernetes.io/projected/204cceda-eecf-48b2-b808-d2981ea6f0be-kube-api-access-qcxvg\") pod \"keystone-db-sync-9srm9\" (UID: \"204cceda-eecf-48b2-b808-d2981ea6f0be\") " pod="openstack/keystone-db-sync-9srm9" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.023894 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d11a1d54-dde6-466c-a500-72fd1c349db3-operator-scripts\") pod \"barbican-db-create-dhlk7\" (UID: \"d11a1d54-dde6-466c-a500-72fd1c349db3\") " pod="openstack/barbican-db-create-dhlk7" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.024020 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scq5v\" (UniqueName: \"kubernetes.io/projected/d11a1d54-dde6-466c-a500-72fd1c349db3-kube-api-access-scq5v\") pod \"barbican-db-create-dhlk7\" (UID: \"d11a1d54-dde6-466c-a500-72fd1c349db3\") " pod="openstack/barbican-db-create-dhlk7" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.024121 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/949c94b8-282b-40f0-bba5-3865562af774-operator-scripts\") pod \"neutron-f965-account-create-update-mfv8t\" (UID: \"949c94b8-282b-40f0-bba5-3865562af774\") " pod="openstack/neutron-f965-account-create-update-mfv8t" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.024222 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kn2m\" (UniqueName: \"kubernetes.io/projected/949c94b8-282b-40f0-bba5-3865562af774-kube-api-access-2kn2m\") pod \"neutron-f965-account-create-update-mfv8t\" (UID: \"949c94b8-282b-40f0-bba5-3865562af774\") " pod="openstack/neutron-f965-account-create-update-mfv8t" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.024307 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204cceda-eecf-48b2-b808-d2981ea6f0be-config-data\") pod \"keystone-db-sync-9srm9\" (UID: \"204cceda-eecf-48b2-b808-d2981ea6f0be\") " pod="openstack/keystone-db-sync-9srm9" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.024387 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204cceda-eecf-48b2-b808-d2981ea6f0be-combined-ca-bundle\") pod \"keystone-db-sync-9srm9\" (UID: \"204cceda-eecf-48b2-b808-d2981ea6f0be\") " pod="openstack/keystone-db-sync-9srm9" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.024557 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d11a1d54-dde6-466c-a500-72fd1c349db3-operator-scripts\") pod \"barbican-db-create-dhlk7\" (UID: \"d11a1d54-dde6-466c-a500-72fd1c349db3\") " pod="openstack/barbican-db-create-dhlk7" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.032699 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5f9b-account-create-update-7kbcm" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.036254 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-k2nnj"] Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.037401 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-k2nnj" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.044501 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scq5v\" (UniqueName: \"kubernetes.io/projected/d11a1d54-dde6-466c-a500-72fd1c349db3-kube-api-access-scq5v\") pod \"barbican-db-create-dhlk7\" (UID: \"d11a1d54-dde6-466c-a500-72fd1c349db3\") " pod="openstack/barbican-db-create-dhlk7" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.047835 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-k2nnj"] Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.072845 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-57b1-account-create-update-rgxfk"] Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.074002 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-57b1-account-create-update-rgxfk" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.079344 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.091839 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-57b1-account-create-update-rgxfk"] Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.125433 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-wdvns" event={"ID":"f4c5f514-3700-48be-bc71-77a939cb171e","Type":"ContainerStarted","Data":"f9ccc874c4ceab9534d664f1ca6ea21b9c71429cd16a0a0e3849b0d8c8a1e15d"} Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.125571 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-wdvns" podUID="f4c5f514-3700-48be-bc71-77a939cb171e" containerName="dnsmasq-dns" containerID="cri-o://f9ccc874c4ceab9534d664f1ca6ea21b9c71429cd16a0a0e3849b0d8c8a1e15d" gracePeriod=10 Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.125688 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204cceda-eecf-48b2-b808-d2981ea6f0be-config-data\") pod \"keystone-db-sync-9srm9\" (UID: \"204cceda-eecf-48b2-b808-d2981ea6f0be\") " pod="openstack/keystone-db-sync-9srm9" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.125716 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204cceda-eecf-48b2-b808-d2981ea6f0be-combined-ca-bundle\") pod \"keystone-db-sync-9srm9\" (UID: \"204cceda-eecf-48b2-b808-d2981ea6f0be\") " pod="openstack/keystone-db-sync-9srm9" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.125796 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcxvg\" (UniqueName: \"kubernetes.io/projected/204cceda-eecf-48b2-b808-d2981ea6f0be-kube-api-access-qcxvg\") pod \"keystone-db-sync-9srm9\" (UID: \"204cceda-eecf-48b2-b808-d2981ea6f0be\") " pod="openstack/keystone-db-sync-9srm9" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.125825 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-wdvns" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.125828 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/201fe27b-47b0-4b1d-89d5-fc37b565a76a-operator-scripts\") pod \"barbican-57b1-account-create-update-rgxfk\" (UID: \"201fe27b-47b0-4b1d-89d5-fc37b565a76a\") " pod="openstack/barbican-57b1-account-create-update-rgxfk" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.126021 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5-operator-scripts\") pod \"neutron-db-create-k2nnj\" (UID: \"f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5\") " pod="openstack/neutron-db-create-k2nnj" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.126099 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c7l4\" (UniqueName: \"kubernetes.io/projected/201fe27b-47b0-4b1d-89d5-fc37b565a76a-kube-api-access-9c7l4\") pod 
\"barbican-57b1-account-create-update-rgxfk\" (UID: \"201fe27b-47b0-4b1d-89d5-fc37b565a76a\") " pod="openstack/barbican-57b1-account-create-update-rgxfk" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.126205 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgjlj\" (UniqueName: \"kubernetes.io/projected/f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5-kube-api-access-mgjlj\") pod \"neutron-db-create-k2nnj\" (UID: \"f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5\") " pod="openstack/neutron-db-create-k2nnj" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.126235 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949c94b8-282b-40f0-bba5-3865562af774-operator-scripts\") pod \"neutron-f965-account-create-update-mfv8t\" (UID: \"949c94b8-282b-40f0-bba5-3865562af774\") " pod="openstack/neutron-f965-account-create-update-mfv8t" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.126295 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kn2m\" (UniqueName: \"kubernetes.io/projected/949c94b8-282b-40f0-bba5-3865562af774-kube-api-access-2kn2m\") pod \"neutron-f965-account-create-update-mfv8t\" (UID: \"949c94b8-282b-40f0-bba5-3865562af774\") " pod="openstack/neutron-f965-account-create-update-mfv8t" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.127002 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949c94b8-282b-40f0-bba5-3865562af774-operator-scripts\") pod \"neutron-f965-account-create-update-mfv8t\" (UID: \"949c94b8-282b-40f0-bba5-3865562af774\") " pod="openstack/neutron-f965-account-create-update-mfv8t" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.133183 4742 generic.go:334] "Generic (PLEG): container finished" podID="1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" containerID="f62f2d9b2ea480fb5376fcc72cb6afe1ec837e34900114bff50803a1ce8cbc20" exitCode=0 Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.133230 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" event={"ID":"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c","Type":"ContainerDied","Data":"f62f2d9b2ea480fb5376fcc72cb6afe1ec837e34900114bff50803a1ce8cbc20"} Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.133259 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" event={"ID":"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c","Type":"ContainerStarted","Data":"09d58755e6c9b6d33b0b6d038e61f810e1ca1daee640d3c8eb71418907824ad8"} Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.135285 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204cceda-eecf-48b2-b808-d2981ea6f0be-combined-ca-bundle\") pod \"keystone-db-sync-9srm9\" (UID: \"204cceda-eecf-48b2-b808-d2981ea6f0be\") " pod="openstack/keystone-db-sync-9srm9" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.144500 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcxvg\" (UniqueName: \"kubernetes.io/projected/204cceda-eecf-48b2-b808-d2981ea6f0be-kube-api-access-qcxvg\") pod \"keystone-db-sync-9srm9\" (UID: \"204cceda-eecf-48b2-b808-d2981ea6f0be\") " pod="openstack/keystone-db-sync-9srm9" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.155065 4742 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204cceda-eecf-48b2-b808-d2981ea6f0be-config-data\") pod \"keystone-db-sync-9srm9\" (UID: \"204cceda-eecf-48b2-b808-d2981ea6f0be\") " pod="openstack/keystone-db-sync-9srm9" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.155565 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kn2m\" (UniqueName: \"kubernetes.io/projected/949c94b8-282b-40f0-bba5-3865562af774-kube-api-access-2kn2m\") pod \"neutron-f965-account-create-update-mfv8t\" (UID: \"949c94b8-282b-40f0-bba5-3865562af774\") " pod="openstack/neutron-f965-account-create-update-mfv8t" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.156579 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-wdvns" podStartSLOduration=6.156568983 podStartE2EDuration="6.156568983s" podCreationTimestamp="2026-03-17 11:32:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:32:10.155517403 +0000 UTC m=+1233.281645161" watchObservedRunningTime="2026-03-17 11:32:10.156568983 +0000 UTC m=+1233.282696741" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.227700 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgjlj\" (UniqueName: \"kubernetes.io/projected/f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5-kube-api-access-mgjlj\") pod \"neutron-db-create-k2nnj\" (UID: \"f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5\") " pod="openstack/neutron-db-create-k2nnj" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.227800 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/201fe27b-47b0-4b1d-89d5-fc37b565a76a-operator-scripts\") pod \"barbican-57b1-account-create-update-rgxfk\" (UID: \"201fe27b-47b0-4b1d-89d5-fc37b565a76a\") " pod="openstack/barbican-57b1-account-create-update-rgxfk" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.227882 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5-operator-scripts\") pod \"neutron-db-create-k2nnj\" (UID: \"f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5\") " pod="openstack/neutron-db-create-k2nnj" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.227928 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c7l4\" (UniqueName: \"kubernetes.io/projected/201fe27b-47b0-4b1d-89d5-fc37b565a76a-kube-api-access-9c7l4\") pod \"barbican-57b1-account-create-update-rgxfk\" (UID: \"201fe27b-47b0-4b1d-89d5-fc37b565a76a\") " pod="openstack/barbican-57b1-account-create-update-rgxfk" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.228666 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/201fe27b-47b0-4b1d-89d5-fc37b565a76a-operator-scripts\") pod \"barbican-57b1-account-create-update-rgxfk\" (UID: \"201fe27b-47b0-4b1d-89d5-fc37b565a76a\") " pod="openstack/barbican-57b1-account-create-update-rgxfk" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.228844 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5-operator-scripts\") pod 
\"neutron-db-create-k2nnj\" (UID: \"f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5\") " pod="openstack/neutron-db-create-k2nnj" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.232486 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dhlk7" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.268685 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgjlj\" (UniqueName: \"kubernetes.io/projected/f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5-kube-api-access-mgjlj\") pod \"neutron-db-create-k2nnj\" (UID: \"f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5\") " pod="openstack/neutron-db-create-k2nnj" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.268787 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c7l4\" (UniqueName: \"kubernetes.io/projected/201fe27b-47b0-4b1d-89d5-fc37b565a76a-kube-api-access-9c7l4\") pod \"barbican-57b1-account-create-update-rgxfk\" (UID: \"201fe27b-47b0-4b1d-89d5-fc37b565a76a\") " pod="openstack/barbican-57b1-account-create-update-rgxfk" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.316215 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9srm9" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.324308 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f965-account-create-update-mfv8t" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.348817 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-7kc77"] Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.361324 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-7kc77"] Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.370028 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-dps6c"] Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.416547 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-k2nnj" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.426420 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-57b1-account-create-update-rgxfk" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.644494 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5f9b-account-create-update-7kbcm"] Mar 17 11:32:10 crc kubenswrapper[4742]: W0317 11:32:10.650749 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45ef3446_d4b4_42dd_862f_0e4c548a9752.slice/crio-b8e8e21e09b68e0ef27ce907a09d21b56ff2d80a47d834ac5a0a1c03c0cac35a WatchSource:0}: Error finding container b8e8e21e09b68e0ef27ce907a09d21b56ff2d80a47d834ac5a0a1c03c0cac35a: Status 404 returned error can't find the container with id b8e8e21e09b68e0ef27ce907a09d21b56ff2d80a47d834ac5a0a1c03c0cac35a Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.673260 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cfb2b46-5e43-4332-901d-85ca6fbc2ad8" path="/var/lib/kubelet/pods/4cfb2b46-5e43-4332-901d-85ca6fbc2ad8/volumes" Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.775326 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-wdvns"
Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.839483 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-dns-svc\") pod \"f4c5f514-3700-48be-bc71-77a939cb171e\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") "
Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.839736 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-dns-swift-storage-0\") pod \"f4c5f514-3700-48be-bc71-77a939cb171e\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") "
Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.839858 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-ovsdbserver-sb\") pod \"f4c5f514-3700-48be-bc71-77a939cb171e\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") "
Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.839970 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-config\") pod \"f4c5f514-3700-48be-bc71-77a939cb171e\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") "
Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.840046 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rws4j\" (UniqueName: \"kubernetes.io/projected/f4c5f514-3700-48be-bc71-77a939cb171e-kube-api-access-rws4j\") pod \"f4c5f514-3700-48be-bc71-77a939cb171e\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") "
Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.840181 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-ovsdbserver-nb\") pod \"f4c5f514-3700-48be-bc71-77a939cb171e\" (UID: \"f4c5f514-3700-48be-bc71-77a939cb171e\") "
Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.852774 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4c5f514-3700-48be-bc71-77a939cb171e-kube-api-access-rws4j" (OuterVolumeSpecName: "kube-api-access-rws4j") pod "f4c5f514-3700-48be-bc71-77a939cb171e" (UID: "f4c5f514-3700-48be-bc71-77a939cb171e"). InnerVolumeSpecName "kube-api-access-rws4j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.894147 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-dhlk7"]
Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.924969 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f4c5f514-3700-48be-bc71-77a939cb171e" (UID: "f4c5f514-3700-48be-bc71-77a939cb171e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.941980 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f4c5f514-3700-48be-bc71-77a939cb171e" (UID: "f4c5f514-3700-48be-bc71-77a939cb171e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.948016 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.948046 4742 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.948059 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rws4j\" (UniqueName: \"kubernetes.io/projected/f4c5f514-3700-48be-bc71-77a939cb171e-kube-api-access-rws4j\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.948407 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f4c5f514-3700-48be-bc71-77a939cb171e" (UID: "f4c5f514-3700-48be-bc71-77a939cb171e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.951343 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-config" (OuterVolumeSpecName: "config") pod "f4c5f514-3700-48be-bc71-77a939cb171e" (UID: "f4c5f514-3700-48be-bc71-77a939cb171e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:32:10 crc kubenswrapper[4742]: I0317 11:32:10.953170 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f4c5f514-3700-48be-bc71-77a939cb171e" (UID: "f4c5f514-3700-48be-bc71-77a939cb171e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.027703 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f965-account-create-update-mfv8t"]
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.038382 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9srm9"]
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.049246 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-config\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.049266 4742 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.049276 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4c5f514-3700-48be-bc71-77a939cb171e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.051472 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-57b1-account-create-update-rgxfk"]
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.148206 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f965-account-create-update-mfv8t" event={"ID":"949c94b8-282b-40f0-bba5-3865562af774","Type":"ContainerStarted","Data":"2cb39793f3197a28b9d696b9cd2e310c06d6d42f57599780bd4791b178244ef7"}
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.150929 4742 generic.go:334] "Generic (PLEG): container finished" podID="1e82963e-fa88-4b3c-847c-4fc0976e63b0" containerID="f94bad1cff14b843d25f11d02ce6595ca41b53fd50bc7ce6eb7e828abbefcb08" exitCode=0
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.151030 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dps6c" event={"ID":"1e82963e-fa88-4b3c-847c-4fc0976e63b0","Type":"ContainerDied","Data":"f94bad1cff14b843d25f11d02ce6595ca41b53fd50bc7ce6eb7e828abbefcb08"}
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.151060 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dps6c" event={"ID":"1e82963e-fa88-4b3c-847c-4fc0976e63b0","Type":"ContainerStarted","Data":"ff24ef5e6561c921d7d408d03d5dce79a142c7f3c503b94d8ded5576980254b9"}
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.155374 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dhlk7" event={"ID":"d11a1d54-dde6-466c-a500-72fd1c349db3","Type":"ContainerStarted","Data":"24a13e72d335dd32d2187c0d392b6bd7f2a5db45d16aa6701704385c4adfbf83"}
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.159694 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9srm9" event={"ID":"204cceda-eecf-48b2-b808-d2981ea6f0be","Type":"ContainerStarted","Data":"d3533f0fe40e99948ce905a25bc9d84eb337e7938ecaf22d56159bff6a01c4c4"}
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.169278 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" event={"ID":"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c","Type":"ContainerStarted","Data":"821f7b0c9ffb1c5b507d88792fc4fdc096a2cfb7a0a31e127fe30419b71a81af"}
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.169546 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn"
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.180313 4742 generic.go:334] "Generic (PLEG): container finished" podID="f4c5f514-3700-48be-bc71-77a939cb171e" containerID="f9ccc874c4ceab9534d664f1ca6ea21b9c71429cd16a0a0e3849b0d8c8a1e15d" exitCode=0
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.180431 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-wdvns" event={"ID":"f4c5f514-3700-48be-bc71-77a939cb171e","Type":"ContainerDied","Data":"f9ccc874c4ceab9534d664f1ca6ea21b9c71429cd16a0a0e3849b0d8c8a1e15d"}
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.180473 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-wdvns" event={"ID":"f4c5f514-3700-48be-bc71-77a939cb171e","Type":"ContainerDied","Data":"ca2347ec0115e06573dc3ada26aa0937d075479bd71a1c014e0eb82ef7b9bb97"}
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.180495 4742 scope.go:117] "RemoveContainer" containerID="f9ccc874c4ceab9534d664f1ca6ea21b9c71429cd16a0a0e3849b0d8c8a1e15d"
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.180687 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-wdvns"
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.188793 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5f9b-account-create-update-7kbcm" event={"ID":"45ef3446-d4b4-42dd-862f-0e4c548a9752","Type":"ContainerStarted","Data":"27c73a6430d4129faf969c37f4087e8e8e6e9df6cb6e5b19aa3c6eef621646ff"}
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.188836 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5f9b-account-create-update-7kbcm" event={"ID":"45ef3446-d4b4-42dd-862f-0e4c548a9752","Type":"ContainerStarted","Data":"b8e8e21e09b68e0ef27ce907a09d21b56ff2d80a47d834ac5a0a1c03c0cac35a"}
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.191118 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-57b1-account-create-update-rgxfk" event={"ID":"201fe27b-47b0-4b1d-89d5-fc37b565a76a","Type":"ContainerStarted","Data":"e0bf66b07bc9cff2aafd345d21f76719126a6a5a91d55b3f09b913a52cfc6c7e"}
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.199021 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" podStartSLOduration=5.199003638 podStartE2EDuration="5.199003638s" podCreationTimestamp="2026-03-17 11:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:32:11.197326842 +0000 UTC m=+1234.323454600" watchObservedRunningTime="2026-03-17 11:32:11.199003638 +0000 UTC m=+1234.325131396"
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.219044 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-k2nnj"]
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.238203 4742 scope.go:117] "RemoveContainer" containerID="92d0f99375060c4cdb285c1c59463e9edd7cbcfb69ef483440d13c42d590a10e"
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.247327 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-wdvns"]
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.253468 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-wdvns"]
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.281391 4742 scope.go:117] "RemoveContainer" containerID="f9ccc874c4ceab9534d664f1ca6ea21b9c71429cd16a0a0e3849b0d8c8a1e15d"
Mar 17 11:32:11 crc kubenswrapper[4742]: E0317 11:32:11.283084 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9ccc874c4ceab9534d664f1ca6ea21b9c71429cd16a0a0e3849b0d8c8a1e15d\": container with ID starting with f9ccc874c4ceab9534d664f1ca6ea21b9c71429cd16a0a0e3849b0d8c8a1e15d not found: ID does not exist" containerID="f9ccc874c4ceab9534d664f1ca6ea21b9c71429cd16a0a0e3849b0d8c8a1e15d"
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.283122 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9ccc874c4ceab9534d664f1ca6ea21b9c71429cd16a0a0e3849b0d8c8a1e15d"} err="failed to get container status \"f9ccc874c4ceab9534d664f1ca6ea21b9c71429cd16a0a0e3849b0d8c8a1e15d\": rpc error: code = NotFound desc = could not find container \"f9ccc874c4ceab9534d664f1ca6ea21b9c71429cd16a0a0e3849b0d8c8a1e15d\": container with ID starting with f9ccc874c4ceab9534d664f1ca6ea21b9c71429cd16a0a0e3849b0d8c8a1e15d not found: ID does not exist"
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.283145 4742 scope.go:117] "RemoveContainer" containerID="92d0f99375060c4cdb285c1c59463e9edd7cbcfb69ef483440d13c42d590a10e"
Mar 17 11:32:11 crc kubenswrapper[4742]: E0317 11:32:11.283542 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92d0f99375060c4cdb285c1c59463e9edd7cbcfb69ef483440d13c42d590a10e\": container with ID starting with 92d0f99375060c4cdb285c1c59463e9edd7cbcfb69ef483440d13c42d590a10e not found: ID does not exist" containerID="92d0f99375060c4cdb285c1c59463e9edd7cbcfb69ef483440d13c42d590a10e"
Mar 17 11:32:11 crc kubenswrapper[4742]: I0317 11:32:11.283588 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92d0f99375060c4cdb285c1c59463e9edd7cbcfb69ef483440d13c42d590a10e"} err="failed to get container status \"92d0f99375060c4cdb285c1c59463e9edd7cbcfb69ef483440d13c42d590a10e\": rpc error: code = NotFound desc = could not find container \"92d0f99375060c4cdb285c1c59463e9edd7cbcfb69ef483440d13c42d590a10e\": container with ID starting with 92d0f99375060c4cdb285c1c59463e9edd7cbcfb69ef483440d13c42d590a10e not found: ID does not exist"
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.203179 4742 generic.go:334] "Generic (PLEG): container finished" podID="45ef3446-d4b4-42dd-862f-0e4c548a9752" containerID="27c73a6430d4129faf969c37f4087e8e8e6e9df6cb6e5b19aa3c6eef621646ff" exitCode=0
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.203240 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5f9b-account-create-update-7kbcm" event={"ID":"45ef3446-d4b4-42dd-862f-0e4c548a9752","Type":"ContainerDied","Data":"27c73a6430d4129faf969c37f4087e8e8e6e9df6cb6e5b19aa3c6eef621646ff"}
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.205257 4742 generic.go:334] "Generic (PLEG): container finished" podID="201fe27b-47b0-4b1d-89d5-fc37b565a76a" containerID="5617aca000aa7b5e6311644bc6b1a093cfe8e1a8ae9ba31c7befacd5b573b699" exitCode=0
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.205344 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-57b1-account-create-update-rgxfk" event={"ID":"201fe27b-47b0-4b1d-89d5-fc37b565a76a","Type":"ContainerDied","Data":"5617aca000aa7b5e6311644bc6b1a093cfe8e1a8ae9ba31c7befacd5b573b699"}
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.208244 4742 generic.go:334] "Generic (PLEG): container finished" podID="949c94b8-282b-40f0-bba5-3865562af774" containerID="4ef659e0fc73686bdc219e2b805c8ba4347bbc4acfb2982923291e175e6e71cc" exitCode=0
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.208303 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f965-account-create-update-mfv8t" event={"ID":"949c94b8-282b-40f0-bba5-3865562af774","Type":"ContainerDied","Data":"4ef659e0fc73686bdc219e2b805c8ba4347bbc4acfb2982923291e175e6e71cc"}
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.210628 4742 generic.go:334] "Generic (PLEG): container finished" podID="d11a1d54-dde6-466c-a500-72fd1c349db3" containerID="9bd0a349b324af024e2ad70b8b129ae444cc251b8dc8929d55576fc49d4c0adc" exitCode=0
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.210663 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dhlk7" event={"ID":"d11a1d54-dde6-466c-a500-72fd1c349db3","Type":"ContainerDied","Data":"9bd0a349b324af024e2ad70b8b129ae444cc251b8dc8929d55576fc49d4c0adc"}
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.213132 4742 generic.go:334] "Generic (PLEG): container finished" podID="f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5" containerID="42c04c90a6c9e5fa259d3167072073f029de3a0b987f1ee39c7d23b0648f2a34" exitCode=0
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.213682 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-k2nnj" event={"ID":"f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5","Type":"ContainerDied","Data":"42c04c90a6c9e5fa259d3167072073f029de3a0b987f1ee39c7d23b0648f2a34"}
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.213715 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-k2nnj" event={"ID":"f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5","Type":"ContainerStarted","Data":"616ef5633690c38b6a8921ab148316dc06f677459a54aa81bf440103ff020f5a"}
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.666314 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5f9b-account-create-update-7kbcm"
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.672140 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dps6c"
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.686423 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4c5f514-3700-48be-bc71-77a939cb171e" path="/var/lib/kubelet/pods/f4c5f514-3700-48be-bc71-77a939cb171e/volumes"
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.782527 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g552g\" (UniqueName: \"kubernetes.io/projected/45ef3446-d4b4-42dd-862f-0e4c548a9752-kube-api-access-g552g\") pod \"45ef3446-d4b4-42dd-862f-0e4c548a9752\" (UID: \"45ef3446-d4b4-42dd-862f-0e4c548a9752\") "
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.782620 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-674qz\" (UniqueName: \"kubernetes.io/projected/1e82963e-fa88-4b3c-847c-4fc0976e63b0-kube-api-access-674qz\") pod \"1e82963e-fa88-4b3c-847c-4fc0976e63b0\" (UID: \"1e82963e-fa88-4b3c-847c-4fc0976e63b0\") "
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.782672 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45ef3446-d4b4-42dd-862f-0e4c548a9752-operator-scripts\") pod \"45ef3446-d4b4-42dd-862f-0e4c548a9752\" (UID: \"45ef3446-d4b4-42dd-862f-0e4c548a9752\") "
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.782788 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e82963e-fa88-4b3c-847c-4fc0976e63b0-operator-scripts\") pod \"1e82963e-fa88-4b3c-847c-4fc0976e63b0\" (UID: \"1e82963e-fa88-4b3c-847c-4fc0976e63b0\") "
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.783443 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e82963e-fa88-4b3c-847c-4fc0976e63b0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e82963e-fa88-4b3c-847c-4fc0976e63b0" (UID: "1e82963e-fa88-4b3c-847c-4fc0976e63b0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.783441 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ef3446-d4b4-42dd-862f-0e4c548a9752-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45ef3446-d4b4-42dd-862f-0e4c548a9752" (UID: "45ef3446-d4b4-42dd-862f-0e4c548a9752"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.789011 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45ef3446-d4b4-42dd-862f-0e4c548a9752-kube-api-access-g552g" (OuterVolumeSpecName: "kube-api-access-g552g") pod "45ef3446-d4b4-42dd-862f-0e4c548a9752" (UID: "45ef3446-d4b4-42dd-862f-0e4c548a9752"). InnerVolumeSpecName "kube-api-access-g552g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.789230 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e82963e-fa88-4b3c-847c-4fc0976e63b0-kube-api-access-674qz" (OuterVolumeSpecName: "kube-api-access-674qz") pod "1e82963e-fa88-4b3c-847c-4fc0976e63b0" (UID: "1e82963e-fa88-4b3c-847c-4fc0976e63b0"). InnerVolumeSpecName "kube-api-access-674qz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.885042 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e82963e-fa88-4b3c-847c-4fc0976e63b0-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.885074 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g552g\" (UniqueName: \"kubernetes.io/projected/45ef3446-d4b4-42dd-862f-0e4c548a9752-kube-api-access-g552g\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.885091 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-674qz\" (UniqueName: \"kubernetes.io/projected/1e82963e-fa88-4b3c-847c-4fc0976e63b0-kube-api-access-674qz\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:12 crc kubenswrapper[4742]: I0317 11:32:12.885101 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45ef3446-d4b4-42dd-862f-0e4c548a9752-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:13 crc kubenswrapper[4742]: I0317 11:32:13.228264 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dps6c" event={"ID":"1e82963e-fa88-4b3c-847c-4fc0976e63b0","Type":"ContainerDied","Data":"ff24ef5e6561c921d7d408d03d5dce79a142c7f3c503b94d8ded5576980254b9"}
Mar 17 11:32:13 crc kubenswrapper[4742]: I0317 11:32:13.228313 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff24ef5e6561c921d7d408d03d5dce79a142c7f3c503b94d8ded5576980254b9"
Mar 17 11:32:13 crc kubenswrapper[4742]: I0317 11:32:13.228402 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dps6c"
Mar 17 11:32:13 crc kubenswrapper[4742]: I0317 11:32:13.234975 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5f9b-account-create-update-7kbcm" event={"ID":"45ef3446-d4b4-42dd-862f-0e4c548a9752","Type":"ContainerDied","Data":"b8e8e21e09b68e0ef27ce907a09d21b56ff2d80a47d834ac5a0a1c03c0cac35a"}
Mar 17 11:32:13 crc kubenswrapper[4742]: I0317 11:32:13.235003 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8e8e21e09b68e0ef27ce907a09d21b56ff2d80a47d834ac5a0a1c03c0cac35a"
Mar 17 11:32:13 crc kubenswrapper[4742]: I0317 11:32:13.235004 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5f9b-account-create-update-7kbcm"
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.114717 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-94v9f"]
Mar 17 11:32:14 crc kubenswrapper[4742]: E0317 11:32:14.115428 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e82963e-fa88-4b3c-847c-4fc0976e63b0" containerName="mariadb-database-create"
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.115450 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e82963e-fa88-4b3c-847c-4fc0976e63b0" containerName="mariadb-database-create"
Mar 17 11:32:14 crc kubenswrapper[4742]: E0317 11:32:14.115472 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ef3446-d4b4-42dd-862f-0e4c548a9752" containerName="mariadb-account-create-update"
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.115481 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ef3446-d4b4-42dd-862f-0e4c548a9752" containerName="mariadb-account-create-update"
Mar 17 11:32:14 crc kubenswrapper[4742]: E0317 11:32:14.115496 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c5f514-3700-48be-bc71-77a939cb171e" containerName="dnsmasq-dns"
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.115504 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c5f514-3700-48be-bc71-77a939cb171e" containerName="dnsmasq-dns"
Mar 17 11:32:14 crc kubenswrapper[4742]: E0317 11:32:14.115518 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c5f514-3700-48be-bc71-77a939cb171e" containerName="init"
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.115525 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c5f514-3700-48be-bc71-77a939cb171e" containerName="init"
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.115873 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="45ef3446-d4b4-42dd-862f-0e4c548a9752" containerName="mariadb-account-create-update"
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.115924 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4c5f514-3700-48be-bc71-77a939cb171e" containerName="dnsmasq-dns"
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.115934 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e82963e-fa88-4b3c-847c-4fc0976e63b0" containerName="mariadb-database-create"
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.116568 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-94v9f"
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.124181 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.134868 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-94v9f"]
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.215715 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb9bc7db-7919-4ec4-b316-e8aaeb69b05d-operator-scripts\") pod \"root-account-create-update-94v9f\" (UID: \"bb9bc7db-7919-4ec4-b316-e8aaeb69b05d\") " pod="openstack/root-account-create-update-94v9f"
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.216309 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsvwp\" (UniqueName: \"kubernetes.io/projected/bb9bc7db-7919-4ec4-b316-e8aaeb69b05d-kube-api-access-bsvwp\") pod \"root-account-create-update-94v9f\" (UID: \"bb9bc7db-7919-4ec4-b316-e8aaeb69b05d\") " pod="openstack/root-account-create-update-94v9f"
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.318177 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsvwp\" (UniqueName: \"kubernetes.io/projected/bb9bc7db-7919-4ec4-b316-e8aaeb69b05d-kube-api-access-bsvwp\") pod \"root-account-create-update-94v9f\" (UID: \"bb9bc7db-7919-4ec4-b316-e8aaeb69b05d\") " pod="openstack/root-account-create-update-94v9f"
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.319332 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb9bc7db-7919-4ec4-b316-e8aaeb69b05d-operator-scripts\") pod \"root-account-create-update-94v9f\" (UID: \"bb9bc7db-7919-4ec4-b316-e8aaeb69b05d\") " pod="openstack/root-account-create-update-94v9f"
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.320517 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb9bc7db-7919-4ec4-b316-e8aaeb69b05d-operator-scripts\") pod \"root-account-create-update-94v9f\" (UID: \"bb9bc7db-7919-4ec4-b316-e8aaeb69b05d\") " pod="openstack/root-account-create-update-94v9f"
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.339120 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsvwp\" (UniqueName: \"kubernetes.io/projected/bb9bc7db-7919-4ec4-b316-e8aaeb69b05d-kube-api-access-bsvwp\") pod \"root-account-create-update-94v9f\" (UID: \"bb9bc7db-7919-4ec4-b316-e8aaeb69b05d\") " pod="openstack/root-account-create-update-94v9f"
Mar 17 11:32:14 crc kubenswrapper[4742]: I0317 11:32:14.441634 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-94v9f"
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.247232 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-k2nnj"
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.259549 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dhlk7"
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.260022 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-dhlk7" event={"ID":"d11a1d54-dde6-466c-a500-72fd1c349db3","Type":"ContainerDied","Data":"24a13e72d335dd32d2187c0d392b6bd7f2a5db45d16aa6701704385c4adfbf83"}
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.260050 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24a13e72d335dd32d2187c0d392b6bd7f2a5db45d16aa6701704385c4adfbf83"
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.264880 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-k2nnj"
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.265386 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-k2nnj" event={"ID":"f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5","Type":"ContainerDied","Data":"616ef5633690c38b6a8921ab148316dc06f677459a54aa81bf440103ff020f5a"}
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.265415 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="616ef5633690c38b6a8921ab148316dc06f677459a54aa81bf440103ff020f5a"
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.270145 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-57b1-account-create-update-rgxfk" event={"ID":"201fe27b-47b0-4b1d-89d5-fc37b565a76a","Type":"ContainerDied","Data":"e0bf66b07bc9cff2aafd345d21f76719126a6a5a91d55b3f09b913a52cfc6c7e"}
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.270168 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0bf66b07bc9cff2aafd345d21f76719126a6a5a91d55b3f09b913a52cfc6c7e"
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.275193 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f965-account-create-update-mfv8t" event={"ID":"949c94b8-282b-40f0-bba5-3865562af774","Type":"ContainerDied","Data":"2cb39793f3197a28b9d696b9cd2e310c06d6d42f57599780bd4791b178244ef7"}
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.275230 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cb39793f3197a28b9d696b9cd2e310c06d6d42f57599780bd4791b178244ef7"
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.300963 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-57b1-account-create-update-rgxfk"
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.324334 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f965-account-create-update-mfv8t"
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.353296 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5-operator-scripts\") pod \"f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5\" (UID: \"f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5\") "
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.353340 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgjlj\" (UniqueName: \"kubernetes.io/projected/f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5-kube-api-access-mgjlj\") pod \"f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5\" (UID: \"f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5\") "
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.353367 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d11a1d54-dde6-466c-a500-72fd1c349db3-operator-scripts\") pod \"d11a1d54-dde6-466c-a500-72fd1c349db3\" (UID: \"d11a1d54-dde6-466c-a500-72fd1c349db3\") "
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.353438 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c7l4\" (UniqueName: \"kubernetes.io/projected/201fe27b-47b0-4b1d-89d5-fc37b565a76a-kube-api-access-9c7l4\") pod \"201fe27b-47b0-4b1d-89d5-fc37b565a76a\" (UID: \"201fe27b-47b0-4b1d-89d5-fc37b565a76a\") "
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.353457 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scq5v\" (UniqueName: \"kubernetes.io/projected/d11a1d54-dde6-466c-a500-72fd1c349db3-kube-api-access-scq5v\") pod \"d11a1d54-dde6-466c-a500-72fd1c349db3\" (UID: \"d11a1d54-dde6-466c-a500-72fd1c349db3\") "
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.353483 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/201fe27b-47b0-4b1d-89d5-fc37b565a76a-operator-scripts\") pod \"201fe27b-47b0-4b1d-89d5-fc37b565a76a\" (UID: \"201fe27b-47b0-4b1d-89d5-fc37b565a76a\") "
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.354361 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d11a1d54-dde6-466c-a500-72fd1c349db3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d11a1d54-dde6-466c-a500-72fd1c349db3" (UID: "d11a1d54-dde6-466c-a500-72fd1c349db3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.354366 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/201fe27b-47b0-4b1d-89d5-fc37b565a76a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "201fe27b-47b0-4b1d-89d5-fc37b565a76a" (UID: "201fe27b-47b0-4b1d-89d5-fc37b565a76a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.354737 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5" (UID: "f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.358070 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/201fe27b-47b0-4b1d-89d5-fc37b565a76a-kube-api-access-9c7l4" (OuterVolumeSpecName: "kube-api-access-9c7l4") pod "201fe27b-47b0-4b1d-89d5-fc37b565a76a" (UID: "201fe27b-47b0-4b1d-89d5-fc37b565a76a"). InnerVolumeSpecName "kube-api-access-9c7l4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.359071 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11a1d54-dde6-466c-a500-72fd1c349db3-kube-api-access-scq5v" (OuterVolumeSpecName: "kube-api-access-scq5v") pod "d11a1d54-dde6-466c-a500-72fd1c349db3" (UID: "d11a1d54-dde6-466c-a500-72fd1c349db3"). InnerVolumeSpecName "kube-api-access-scq5v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.361105 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5-kube-api-access-mgjlj" (OuterVolumeSpecName: "kube-api-access-mgjlj") pod "f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5" (UID: "f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5"). InnerVolumeSpecName "kube-api-access-mgjlj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.454386 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kn2m\" (UniqueName: \"kubernetes.io/projected/949c94b8-282b-40f0-bba5-3865562af774-kube-api-access-2kn2m\") pod \"949c94b8-282b-40f0-bba5-3865562af774\" (UID: \"949c94b8-282b-40f0-bba5-3865562af774\") "
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.454459 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949c94b8-282b-40f0-bba5-3865562af774-operator-scripts\") pod \"949c94b8-282b-40f0-bba5-3865562af774\" (UID: \"949c94b8-282b-40f0-bba5-3865562af774\") "
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.454672 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c7l4\" (UniqueName: \"kubernetes.io/projected/201fe27b-47b0-4b1d-89d5-fc37b565a76a-kube-api-access-9c7l4\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.454683 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scq5v\" (UniqueName: \"kubernetes.io/projected/d11a1d54-dde6-466c-a500-72fd1c349db3-kube-api-access-scq5v\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.454692 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/201fe27b-47b0-4b1d-89d5-fc37b565a76a-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.454700 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.454707 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgjlj\" (UniqueName: \"kubernetes.io/projected/f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5-kube-api-access-mgjlj\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.454717 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d11a1d54-dde6-466c-a500-72fd1c349db3-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.455322 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949c94b8-282b-40f0-bba5-3865562af774-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "949c94b8-282b-40f0-bba5-3865562af774" (UID: "949c94b8-282b-40f0-bba5-3865562af774"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.457448 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949c94b8-282b-40f0-bba5-3865562af774-kube-api-access-2kn2m" (OuterVolumeSpecName: "kube-api-access-2kn2m") pod "949c94b8-282b-40f0-bba5-3865562af774" (UID: "949c94b8-282b-40f0-bba5-3865562af774"). InnerVolumeSpecName "kube-api-access-2kn2m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.557473 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kn2m\" (UniqueName: \"kubernetes.io/projected/949c94b8-282b-40f0-bba5-3865562af774-kube-api-access-2kn2m\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.557529 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/949c94b8-282b-40f0-bba5-3865562af774-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.607680 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-94v9f"]
Mar 17 11:32:16 crc kubenswrapper[4742]: W0317 11:32:16.623773 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb9bc7db_7919_4ec4_b316_e8aaeb69b05d.slice/crio-45a02c9b7864a4b9236f9ba31f669984ceb197ebe659aca02affd75192a9171e WatchSource:0}: Error finding container 45a02c9b7864a4b9236f9ba31f669984ceb197ebe659aca02affd75192a9171e: Status 404 returned error can't find the container with id 45a02c9b7864a4b9236f9ba31f669984ceb197ebe659aca02affd75192a9171e
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.798451 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn"
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.862621 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-ldqhg"]
Mar 17 11:32:16 crc kubenswrapper[4742]: I0317 11:32:16.862926 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-ldqhg" podUID="d5b7d712-b6f0-43e1-a95b-e49251608407" containerName="dnsmasq-dns" containerID="cri-o://132e6829f0471b35d024d8b51add4272475ab6913eef83c75aa27597272f3deb" gracePeriod=10
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.312441 4742 generic.go:334] "Generic (PLEG): container finished" podID="d5b7d712-b6f0-43e1-a95b-e49251608407" containerID="132e6829f0471b35d024d8b51add4272475ab6913eef83c75aa27597272f3deb" exitCode=0
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.312541 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-ldqhg" event={"ID":"d5b7d712-b6f0-43e1-a95b-e49251608407","Type":"ContainerDied","Data":"132e6829f0471b35d024d8b51add4272475ab6913eef83c75aa27597272f3deb"}
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.312767 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-ldqhg" event={"ID":"d5b7d712-b6f0-43e1-a95b-e49251608407","Type":"ContainerDied","Data":"679b8cc74a75413f4ed61f065f1a70f325cc700c29f59c2bc94c38e560c74dea"}
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.312784 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="679b8cc74a75413f4ed61f065f1a70f325cc700c29f59c2bc94c38e560c74dea"
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.314621 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9srm9" event={"ID":"204cceda-eecf-48b2-b808-d2981ea6f0be","Type":"ContainerStarted","Data":"5dbd54130eda4b981d57f4e9a4456c069dd650f871865dfc3219983e4cc0313b"}
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.316629 4742 generic.go:334] "Generic (PLEG): container finished" podID="bb9bc7db-7919-4ec4-b316-e8aaeb69b05d" containerID="a39d766731a30ce0525a21395e6a75060ba4e8d7e54fa71ec5505353c307d099" exitCode=0
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.316709 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f965-account-create-update-mfv8t"
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.316774 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-94v9f" event={"ID":"bb9bc7db-7919-4ec4-b316-e8aaeb69b05d","Type":"ContainerDied","Data":"a39d766731a30ce0525a21395e6a75060ba4e8d7e54fa71ec5505353c307d099"}
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.316972 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-94v9f" event={"ID":"bb9bc7db-7919-4ec4-b316-e8aaeb69b05d","Type":"ContainerStarted","Data":"45a02c9b7864a4b9236f9ba31f669984ceb197ebe659aca02affd75192a9171e"}
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.317057 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-57b1-account-create-update-rgxfk"
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.317095 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-dhlk7"
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.346081 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-9srm9" podStartSLOduration=3.257421151 podStartE2EDuration="8.346058181s" podCreationTimestamp="2026-03-17 11:32:09 +0000 UTC" firstStartedPulling="2026-03-17 11:32:11.065234623 +0000 UTC m=+1234.191362381" lastFinishedPulling="2026-03-17 11:32:16.153871653 +0000 UTC m=+1239.279999411" observedRunningTime="2026-03-17 11:32:17.334709434 +0000 UTC m=+1240.460837202" watchObservedRunningTime="2026-03-17 11:32:17.346058181 +0000 UTC m=+1240.472185939"
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.353198 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-ldqhg"
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.473111 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-ovsdbserver-nb\") pod \"d5b7d712-b6f0-43e1-a95b-e49251608407\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") "
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.473426 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-dns-svc\") pod \"d5b7d712-b6f0-43e1-a95b-e49251608407\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") "
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.473501 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-config\") pod \"d5b7d712-b6f0-43e1-a95b-e49251608407\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") "
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.473574 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bftxm\" (UniqueName: \"kubernetes.io/projected/d5b7d712-b6f0-43e1-a95b-e49251608407-kube-api-access-bftxm\") pod \"d5b7d712-b6f0-43e1-a95b-e49251608407\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") "
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.473689 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-ovsdbserver-sb\") pod \"d5b7d712-b6f0-43e1-a95b-e49251608407\" (UID: \"d5b7d712-b6f0-43e1-a95b-e49251608407\") "
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.478845 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5b7d712-b6f0-43e1-a95b-e49251608407-kube-api-access-bftxm" (OuterVolumeSpecName: "kube-api-access-bftxm") pod "d5b7d712-b6f0-43e1-a95b-e49251608407" (UID: "d5b7d712-b6f0-43e1-a95b-e49251608407"). InnerVolumeSpecName "kube-api-access-bftxm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.524417 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-config" (OuterVolumeSpecName: "config") pod "d5b7d712-b6f0-43e1-a95b-e49251608407" (UID: "d5b7d712-b6f0-43e1-a95b-e49251608407"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.526323 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d5b7d712-b6f0-43e1-a95b-e49251608407" (UID: "d5b7d712-b6f0-43e1-a95b-e49251608407"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.527460 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d5b7d712-b6f0-43e1-a95b-e49251608407" (UID: "d5b7d712-b6f0-43e1-a95b-e49251608407"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.531883 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d5b7d712-b6f0-43e1-a95b-e49251608407" (UID: "d5b7d712-b6f0-43e1-a95b-e49251608407"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.575298 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.575456 4742 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.575510 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-config\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.575557 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bftxm\" (UniqueName: \"kubernetes.io/projected/d5b7d712-b6f0-43e1-a95b-e49251608407-kube-api-access-bftxm\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:17 crc kubenswrapper[4742]: I0317 11:32:17.575655 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5b7d712-b6f0-43e1-a95b-e49251608407-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.043800 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.043888 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.044007 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw"
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.045096 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a5ef1667f2e6dd9db693993b9f4f126e4ca6164458a0fe8e5b3f3f6b5159b8d2"} pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.045204 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" containerID="cri-o://a5ef1667f2e6dd9db693993b9f4f126e4ca6164458a0fe8e5b3f3f6b5159b8d2" gracePeriod=600
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.326688 4742 generic.go:334] "Generic (PLEG): container finished" podID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerID="a5ef1667f2e6dd9db693993b9f4f126e4ca6164458a0fe8e5b3f3f6b5159b8d2" exitCode=0
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.327203 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-ldqhg"
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.326764 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerDied","Data":"a5ef1667f2e6dd9db693993b9f4f126e4ca6164458a0fe8e5b3f3f6b5159b8d2"}
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.327320 4742 scope.go:117] "RemoveContainer" containerID="e970ab8ae9b7236a8af0e70d950c97f70be620ea87e4acbc181c30424216e493"
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.379517 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-ldqhg"]
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.390633 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-ldqhg"]
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.672760 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5b7d712-b6f0-43e1-a95b-e49251608407" path="/var/lib/kubelet/pods/d5b7d712-b6f0-43e1-a95b-e49251608407/volumes"
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.698352 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-94v9f"
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.799535 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsvwp\" (UniqueName: \"kubernetes.io/projected/bb9bc7db-7919-4ec4-b316-e8aaeb69b05d-kube-api-access-bsvwp\") pod \"bb9bc7db-7919-4ec4-b316-e8aaeb69b05d\" (UID: \"bb9bc7db-7919-4ec4-b316-e8aaeb69b05d\") "
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.799674 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb9bc7db-7919-4ec4-b316-e8aaeb69b05d-operator-scripts\") pod \"bb9bc7db-7919-4ec4-b316-e8aaeb69b05d\" (UID: \"bb9bc7db-7919-4ec4-b316-e8aaeb69b05d\") "
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.800604 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb9bc7db-7919-4ec4-b316-e8aaeb69b05d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb9bc7db-7919-4ec4-b316-e8aaeb69b05d" (UID: "bb9bc7db-7919-4ec4-b316-e8aaeb69b05d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.806114 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb9bc7db-7919-4ec4-b316-e8aaeb69b05d-kube-api-access-bsvwp" (OuterVolumeSpecName: "kube-api-access-bsvwp") pod "bb9bc7db-7919-4ec4-b316-e8aaeb69b05d" (UID: "bb9bc7db-7919-4ec4-b316-e8aaeb69b05d"). InnerVolumeSpecName "kube-api-access-bsvwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.901476 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsvwp\" (UniqueName: \"kubernetes.io/projected/bb9bc7db-7919-4ec4-b316-e8aaeb69b05d-kube-api-access-bsvwp\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.901519 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb9bc7db-7919-4ec4-b316-e8aaeb69b05d-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:18 crc kubenswrapper[4742]: I0317 11:32:18.959087 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Mar 17 11:32:19 crc kubenswrapper[4742]: I0317 11:32:19.335819 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-94v9f" event={"ID":"bb9bc7db-7919-4ec4-b316-e8aaeb69b05d","Type":"ContainerDied","Data":"45a02c9b7864a4b9236f9ba31f669984ceb197ebe659aca02affd75192a9171e"}
Mar 17 11:32:19 crc kubenswrapper[4742]: I0317 11:32:19.335856 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45a02c9b7864a4b9236f9ba31f669984ceb197ebe659aca02affd75192a9171e"
Mar 17 11:32:19 crc kubenswrapper[4742]: I0317 11:32:19.336276 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-94v9f"
Mar 17 11:32:19 crc kubenswrapper[4742]: I0317 11:32:19.338292 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"1aeee9892509f65c6f012471968b84d5122ab43ea074794d2d7aecfdfae8d433"}
Mar 17 11:32:20 crc kubenswrapper[4742]: I0317 11:32:20.351718 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-94v9f"]
Mar 17 11:32:20 crc kubenswrapper[4742]: I0317 11:32:20.357601 4742 generic.go:334] "Generic (PLEG): container finished" podID="204cceda-eecf-48b2-b808-d2981ea6f0be" containerID="5dbd54130eda4b981d57f4e9a4456c069dd650f871865dfc3219983e4cc0313b" exitCode=0
Mar 17 11:32:20 crc kubenswrapper[4742]: I0317 11:32:20.357662 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9srm9" event={"ID":"204cceda-eecf-48b2-b808-d2981ea6f0be","Type":"ContainerDied","Data":"5dbd54130eda4b981d57f4e9a4456c069dd650f871865dfc3219983e4cc0313b"}
Mar 17 11:32:20 crc kubenswrapper[4742]: I0317 11:32:20.364720 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-94v9f"]
Mar 17 11:32:20 crc kubenswrapper[4742]: I0317 11:32:20.674458 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb9bc7db-7919-4ec4-b316-e8aaeb69b05d" path="/var/lib/kubelet/pods/bb9bc7db-7919-4ec4-b316-e8aaeb69b05d/volumes"
Mar 17 11:32:21 crc kubenswrapper[4742]: I0317 11:32:21.691663 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9srm9"
Mar 17 11:32:21 crc kubenswrapper[4742]: I0317 11:32:21.744540 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204cceda-eecf-48b2-b808-d2981ea6f0be-combined-ca-bundle\") pod \"204cceda-eecf-48b2-b808-d2981ea6f0be\" (UID: \"204cceda-eecf-48b2-b808-d2981ea6f0be\") "
Mar 17 11:32:21 crc kubenswrapper[4742]: I0317 11:32:21.744814 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204cceda-eecf-48b2-b808-d2981ea6f0be-config-data\") pod \"204cceda-eecf-48b2-b808-d2981ea6f0be\" (UID: \"204cceda-eecf-48b2-b808-d2981ea6f0be\") "
Mar 17 11:32:21 crc kubenswrapper[4742]: I0317 11:32:21.744896 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcxvg\" (UniqueName: \"kubernetes.io/projected/204cceda-eecf-48b2-b808-d2981ea6f0be-kube-api-access-qcxvg\") pod \"204cceda-eecf-48b2-b808-d2981ea6f0be\" (UID: \"204cceda-eecf-48b2-b808-d2981ea6f0be\") "
Mar 17 11:32:21 crc kubenswrapper[4742]: I0317 11:32:21.750860 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/204cceda-eecf-48b2-b808-d2981ea6f0be-kube-api-access-qcxvg" (OuterVolumeSpecName: "kube-api-access-qcxvg") pod "204cceda-eecf-48b2-b808-d2981ea6f0be" (UID: "204cceda-eecf-48b2-b808-d2981ea6f0be"). InnerVolumeSpecName "kube-api-access-qcxvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:32:21 crc kubenswrapper[4742]: I0317 11:32:21.781009 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204cceda-eecf-48b2-b808-d2981ea6f0be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "204cceda-eecf-48b2-b808-d2981ea6f0be" (UID: "204cceda-eecf-48b2-b808-d2981ea6f0be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:32:21 crc kubenswrapper[4742]: I0317 11:32:21.795616 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204cceda-eecf-48b2-b808-d2981ea6f0be-config-data" (OuterVolumeSpecName: "config-data") pod "204cceda-eecf-48b2-b808-d2981ea6f0be" (UID: "204cceda-eecf-48b2-b808-d2981ea6f0be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:32:21 crc kubenswrapper[4742]: I0317 11:32:21.846405 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204cceda-eecf-48b2-b808-d2981ea6f0be-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:21 crc kubenswrapper[4742]: I0317 11:32:21.846430 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204cceda-eecf-48b2-b808-d2981ea6f0be-config-data\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:21 crc kubenswrapper[4742]: I0317 11:32:21.846440 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcxvg\" (UniqueName: \"kubernetes.io/projected/204cceda-eecf-48b2-b808-d2981ea6f0be-kube-api-access-qcxvg\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.378186 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9srm9" event={"ID":"204cceda-eecf-48b2-b808-d2981ea6f0be","Type":"ContainerDied","Data":"d3533f0fe40e99948ce905a25bc9d84eb337e7938ecaf22d56159bff6a01c4c4"}
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.378241 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3533f0fe40e99948ce905a25bc9d84eb337e7938ecaf22d56159bff6a01c4c4"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.378280 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9srm9"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.690041 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-chqr4"]
Mar 17 11:32:22 crc kubenswrapper[4742]: E0317 11:32:22.690471 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b7d712-b6f0-43e1-a95b-e49251608407" containerName="init"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.690496 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b7d712-b6f0-43e1-a95b-e49251608407" containerName="init"
Mar 17 11:32:22 crc kubenswrapper[4742]: E0317 11:32:22.690522 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201fe27b-47b0-4b1d-89d5-fc37b565a76a" containerName="mariadb-account-create-update"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.690530 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="201fe27b-47b0-4b1d-89d5-fc37b565a76a" containerName="mariadb-account-create-update"
Mar 17 11:32:22 crc kubenswrapper[4742]: E0317 11:32:22.690543 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5" containerName="mariadb-database-create"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.690551 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5" containerName="mariadb-database-create"
Mar 17 11:32:22 crc kubenswrapper[4742]: E0317 11:32:22.690571 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b7d712-b6f0-43e1-a95b-e49251608407" containerName="dnsmasq-dns"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.690579 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b7d712-b6f0-43e1-a95b-e49251608407" containerName="dnsmasq-dns"
Mar 17 11:32:22 crc kubenswrapper[4742]: E0317 11:32:22.690629 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204cceda-eecf-48b2-b808-d2981ea6f0be" containerName="keystone-db-sync"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.690639 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="204cceda-eecf-48b2-b808-d2981ea6f0be" containerName="keystone-db-sync"
Mar 17 11:32:22 crc kubenswrapper[4742]: E0317 11:32:22.690655 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949c94b8-282b-40f0-bba5-3865562af774" containerName="mariadb-account-create-update"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.690663 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="949c94b8-282b-40f0-bba5-3865562af774" containerName="mariadb-account-create-update"
Mar 17 11:32:22 crc kubenswrapper[4742]: E0317 11:32:22.690676 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11a1d54-dde6-466c-a500-72fd1c349db3" containerName="mariadb-database-create"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.690684 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11a1d54-dde6-466c-a500-72fd1c349db3" containerName="mariadb-database-create"
Mar 17 11:32:22 crc kubenswrapper[4742]: E0317 11:32:22.690693 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb9bc7db-7919-4ec4-b316-e8aaeb69b05d" containerName="mariadb-account-create-update"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.690700 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb9bc7db-7919-4ec4-b316-e8aaeb69b05d" containerName="mariadb-account-create-update"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.690878 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="201fe27b-47b0-4b1d-89d5-fc37b565a76a" containerName="mariadb-account-create-update"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.690893 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5b7d712-b6f0-43e1-a95b-e49251608407" containerName="dnsmasq-dns"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.691034 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="949c94b8-282b-40f0-bba5-3865562af774" containerName="mariadb-account-create-update"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.691048 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="204cceda-eecf-48b2-b808-d2981ea6f0be" containerName="keystone-db-sync"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.691058 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11a1d54-dde6-466c-a500-72fd1c349db3" containerName="mariadb-database-create"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.691071 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb9bc7db-7919-4ec4-b316-e8aaeb69b05d" containerName="mariadb-account-create-update"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.691093 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5" containerName="mariadb-database-create"
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.691733 4742 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.696594 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.696837 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.697005 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.697169 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.697294 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fjbw9" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.722786 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-chqr4"] Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.770353 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-scripts\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.770535 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-credential-keys\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.770688 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-combined-ca-bundle\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.770710 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-config-data\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.770754 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbjch\" (UniqueName: \"kubernetes.io/projected/cb042f72-1b4d-4f10-aecb-697ad9780b29-kube-api-access-gbjch\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.770788 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-fernet-keys\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.812687 4742 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-847c4cc679-66ckb"] Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.814023 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.840219 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-66ckb"] Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.874352 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.874410 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-credential-keys\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.874436 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.874453 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-dns-svc\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.874475 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-config\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.874508 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.874538 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-combined-ca-bundle\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.874557 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-config-data\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" 
Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.874582 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbjch\" (UniqueName: \"kubernetes.io/projected/cb042f72-1b4d-4f10-aecb-697ad9780b29-kube-api-access-gbjch\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.874604 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-fernet-keys\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.874640 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km42w\" (UniqueName: \"kubernetes.io/projected/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-kube-api-access-km42w\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.874675 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-scripts\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.881335 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-credential-keys\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.896929 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-combined-ca-bundle\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.902180 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-fernet-keys\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.908507 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-config-data\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.910428 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-scripts\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.923588 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbjch\" 
(UniqueName: \"kubernetes.io/projected/cb042f72-1b4d-4f10-aecb-697ad9780b29-kube-api-access-gbjch\") pod \"keystone-bootstrap-chqr4\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.936457 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-68d65d5b97-9g9h5"] Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.937752 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.948273 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.948334 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-xpskd" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.948699 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.948817 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.966610 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-7kmxq"] Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.967664 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7kmxq" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.973265 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.974488 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ptsjt" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.974691 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.975582 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.975622 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.975639 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-dns-svc\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.975658 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-config\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " 
pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.975689 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jqsd\" (UniqueName: \"kubernetes.io/projected/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-kube-api-access-2jqsd\") pod \"horizon-68d65d5b97-9g9h5\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.975725 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.975787 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-config-data\") pod \"horizon-68d65d5b97-9g9h5\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.975814 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-horizon-secret-key\") pod \"horizon-68d65d5b97-9g9h5\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.975839 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-scripts\") pod \"horizon-68d65d5b97-9g9h5\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.975867 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km42w\" (UniqueName: \"kubernetes.io/projected/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-kube-api-access-km42w\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.975934 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-logs\") pod \"horizon-68d65d5b97-9g9h5\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.976780 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.976809 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " 
pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.977307 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.977858 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-config\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.979646 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-dns-svc\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:22 crc kubenswrapper[4742]: I0317 11:32:22.996179 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7kmxq"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.014169 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68d65d5b97-9g9h5"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.014686 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.038837 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.040706 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.041336 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km42w\" (UniqueName: \"kubernetes.io/projected/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-kube-api-access-km42w\") pod \"dnsmasq-dns-847c4cc679-66ckb\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.065345 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.065597 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.076717 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.080655 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.080705 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-config-data\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.080781 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jqsd\" (UniqueName: \"kubernetes.io/projected/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-kube-api-access-2jqsd\") pod \"horizon-68d65d5b97-9g9h5\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.080829 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4lzd\" (UniqueName: \"kubernetes.io/projected/603a0e75-694a-4ab5-bbe0-616f617bc949-kube-api-access-b4lzd\") pod \"neutron-db-sync-7kmxq\" (UID: \"603a0e75-694a-4ab5-bbe0-616f617bc949\") " pod="openstack/neutron-db-sync-7kmxq" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.080857 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-scripts\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.080876 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/603a0e75-694a-4ab5-bbe0-616f617bc949-config\") pod \"neutron-db-sync-7kmxq\" (UID: \"603a0e75-694a-4ab5-bbe0-616f617bc949\") " pod="openstack/neutron-db-sync-7kmxq" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.080929 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.080952 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-config-data\") pod \"horizon-68d65d5b97-9g9h5\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.080966 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-run-httpd\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.080981 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603a0e75-694a-4ab5-bbe0-616f617bc949-combined-ca-bundle\") pod \"neutron-db-sync-7kmxq\" (UID: \"603a0e75-694a-4ab5-bbe0-616f617bc949\") " pod="openstack/neutron-db-sync-7kmxq" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.081007 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-horizon-secret-key\") pod \"horizon-68d65d5b97-9g9h5\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.081026 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-scripts\") pod \"horizon-68d65d5b97-9g9h5\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.081049 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-log-httpd\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.081076 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-logs\") pod \"horizon-68d65d5b97-9g9h5\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.081094 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd8tp\" (UniqueName: \"kubernetes.io/projected/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-kube-api-access-kd8tp\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.086525 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-scripts\") pod \"horizon-68d65d5b97-9g9h5\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.086743 4742 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-logs\") pod \"horizon-68d65d5b97-9g9h5\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.090844 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-config-data\") pod \"horizon-68d65d5b97-9g9h5\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.104469 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-horizon-secret-key\") pod \"horizon-68d65d5b97-9g9h5\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.131618 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jqsd\" (UniqueName: \"kubernetes.io/projected/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-kube-api-access-2jqsd\") pod \"horizon-68d65d5b97-9g9h5\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.131686 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-7mmzn"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.132687 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7mmzn" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.136306 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.142298 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.142485 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tcgpz" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.146790 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.162036 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-cs4pt"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.163142 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-cs4pt" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.172155 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.175297 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-dd6kx" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.190648 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-combined-ca-bundle\") pod \"placement-db-sync-7mmzn\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") " pod="openstack/placement-db-sync-7mmzn" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.190697 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4lzd\" (UniqueName: \"kubernetes.io/projected/603a0e75-694a-4ab5-bbe0-616f617bc949-kube-api-access-b4lzd\") pod \"neutron-db-sync-7kmxq\" (UID: \"603a0e75-694a-4ab5-bbe0-616f617bc949\") " pod="openstack/neutron-db-sync-7kmxq" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.190720 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-scripts\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.190740 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/603a0e75-694a-4ab5-bbe0-616f617bc949-config\") pod \"neutron-db-sync-7kmxq\" (UID: \"603a0e75-694a-4ab5-bbe0-616f617bc949\") " pod="openstack/neutron-db-sync-7kmxq" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.190755 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-config-data\") pod \"placement-db-sync-7mmzn\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") " pod="openstack/placement-db-sync-7mmzn" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.190770 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-scripts\") pod \"placement-db-sync-7mmzn\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") " pod="openstack/placement-db-sync-7mmzn" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.190800 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.190816 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mmvj\" (UniqueName: \"kubernetes.io/projected/e3261b59-fc08-4596-bda8-7b398ef979e4-kube-api-access-5mmvj\") pod \"placement-db-sync-7mmzn\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") " pod="openstack/placement-db-sync-7mmzn" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.190837 4742 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-run-httpd\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.190853 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603a0e75-694a-4ab5-bbe0-616f617bc949-combined-ca-bundle\") pod \"neutron-db-sync-7kmxq\" (UID: \"603a0e75-694a-4ab5-bbe0-616f617bc949\") " pod="openstack/neutron-db-sync-7kmxq" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.190874 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3261b59-fc08-4596-bda8-7b398ef979e4-logs\") pod \"placement-db-sync-7mmzn\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") " pod="openstack/placement-db-sync-7mmzn" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.190894 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-log-httpd\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.190934 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd8tp\" (UniqueName: \"kubernetes.io/projected/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-kube-api-access-kd8tp\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.190956 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.190976 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-config-data\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.191652 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-run-httpd\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.200773 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.201484 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-log-httpd\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.208011 4742 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7mmzn"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.208295 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603a0e75-694a-4ab5-bbe0-616f617bc949-combined-ca-bundle\") pod \"neutron-db-sync-7kmxq\" (UID: \"603a0e75-694a-4ab5-bbe0-616f617bc949\") " pod="openstack/neutron-db-sync-7kmxq" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.208453 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-config-data\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.210238 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.213138 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/603a0e75-694a-4ab5-bbe0-616f617bc949-config\") pod \"neutron-db-sync-7kmxq\" (UID: \"603a0e75-694a-4ab5-bbe0-616f617bc949\") " pod="openstack/neutron-db-sync-7kmxq" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.218803 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-scripts\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.231349 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4lzd\" (UniqueName: \"kubernetes.io/projected/603a0e75-694a-4ab5-bbe0-616f617bc949-kube-api-access-b4lzd\") pod \"neutron-db-sync-7kmxq\" (UID: \"603a0e75-694a-4ab5-bbe0-616f617bc949\") " pod="openstack/neutron-db-sync-7kmxq" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.239335 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd8tp\" (UniqueName: \"kubernetes.io/projected/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-kube-api-access-kd8tp\") pod \"ceilometer-0\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.251139 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cs4pt"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.292245 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-66ckb"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.300828 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-config-data\") pod \"placement-db-sync-7mmzn\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") " pod="openstack/placement-db-sync-7mmzn" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.300937 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-scripts\") pod \"placement-db-sync-7mmzn\" (UID: 
\"e3261b59-fc08-4596-bda8-7b398ef979e4\") " pod="openstack/placement-db-sync-7mmzn" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.301079 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mmvj\" (UniqueName: \"kubernetes.io/projected/e3261b59-fc08-4596-bda8-7b398ef979e4-kube-api-access-5mmvj\") pod \"placement-db-sync-7mmzn\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") " pod="openstack/placement-db-sync-7mmzn" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.301155 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3261b59-fc08-4596-bda8-7b398ef979e4-logs\") pod \"placement-db-sync-7mmzn\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") " pod="openstack/placement-db-sync-7mmzn" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.301183 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/90b52e42-6eca-4585-95a0-057055089c97-db-sync-config-data\") pod \"barbican-db-sync-cs4pt\" (UID: \"90b52e42-6eca-4585-95a0-057055089c97\") " pod="openstack/barbican-db-sync-cs4pt" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.301236 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90b52e42-6eca-4585-95a0-057055089c97-combined-ca-bundle\") pod \"barbican-db-sync-cs4pt\" (UID: \"90b52e42-6eca-4585-95a0-057055089c97\") " pod="openstack/barbican-db-sync-cs4pt" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.301379 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96fnv\" (UniqueName: \"kubernetes.io/projected/90b52e42-6eca-4585-95a0-057055089c97-kube-api-access-96fnv\") pod \"barbican-db-sync-cs4pt\" (UID: \"90b52e42-6eca-4585-95a0-057055089c97\") " pod="openstack/barbican-db-sync-cs4pt" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.301411 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-combined-ca-bundle\") pod \"placement-db-sync-7mmzn\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") " pod="openstack/placement-db-sync-7mmzn" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.302786 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.305205 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3261b59-fc08-4596-bda8-7b398ef979e4-logs\") pod \"placement-db-sync-7mmzn\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") " pod="openstack/placement-db-sync-7mmzn" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.309286 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-combined-ca-bundle\") pod \"placement-db-sync-7mmzn\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") " pod="openstack/placement-db-sync-7mmzn" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.319508 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-scripts\") pod \"placement-db-sync-7mmzn\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") " pod="openstack/placement-db-sync-7mmzn" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.335605 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-config-data\") pod \"placement-db-sync-7mmzn\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") " pod="openstack/placement-db-sync-7mmzn" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.336106 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mmvj\" (UniqueName: \"kubernetes.io/projected/e3261b59-fc08-4596-bda8-7b398ef979e4-kube-api-access-5mmvj\") pod \"placement-db-sync-7mmzn\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") " pod="openstack/placement-db-sync-7mmzn" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.376968 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6558577bcc-xrft9"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.378205 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.403396 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6558577bcc-xrft9"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.403680 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/90b52e42-6eca-4585-95a0-057055089c97-db-sync-config-data\") pod \"barbican-db-sync-cs4pt\" (UID: \"90b52e42-6eca-4585-95a0-057055089c97\") " pod="openstack/barbican-db-sync-cs4pt" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.403732 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90b52e42-6eca-4585-95a0-057055089c97-combined-ca-bundle\") pod \"barbican-db-sync-cs4pt\" (UID: \"90b52e42-6eca-4585-95a0-057055089c97\") " pod="openstack/barbican-db-sync-cs4pt" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.403792 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96fnv\" (UniqueName: \"kubernetes.io/projected/90b52e42-6eca-4585-95a0-057055089c97-kube-api-access-96fnv\") pod \"barbican-db-sync-cs4pt\" (UID: \"90b52e42-6eca-4585-95a0-057055089c97\") " pod="openstack/barbican-db-sync-cs4pt" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.421031 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/90b52e42-6eca-4585-95a0-057055089c97-db-sync-config-data\") pod \"barbican-db-sync-cs4pt\" (UID: \"90b52e42-6eca-4585-95a0-057055089c97\") " pod="openstack/barbican-db-sync-cs4pt" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.423923 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90b52e42-6eca-4585-95a0-057055089c97-combined-ca-bundle\") pod \"barbican-db-sync-cs4pt\" (UID: \"90b52e42-6eca-4585-95a0-057055089c97\") " pod="openstack/barbican-db-sync-cs4pt" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.430637 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96fnv\" (UniqueName: \"kubernetes.io/projected/90b52e42-6eca-4585-95a0-057055089c97-kube-api-access-96fnv\") pod \"barbican-db-sync-cs4pt\" (UID: \"90b52e42-6eca-4585-95a0-057055089c97\") " pod="openstack/barbican-db-sync-cs4pt" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.443727 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-qzc74"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.444743 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.448488 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7kmxq" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.453370 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.455412 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.465757 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.465783 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mknj4" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.466077 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rh74z" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.466336 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.466559 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.466560 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.470139 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.490237 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.501933 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5lt24"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.503168 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.506676 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.506711 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-scripts\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.506729 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.506765 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nf5v\" (UniqueName: \"kubernetes.io/projected/f848d4a4-4dba-4636-942d-340d83b7750b-kube-api-access-6nf5v\") pod \"horizon-6558577bcc-xrft9\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.506784 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f848d4a4-4dba-4636-942d-340d83b7750b-logs\") pod \"horizon-6558577bcc-xrft9\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.506804 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0670252-0ef9-4bec-bd86-a96560faf4d4-logs\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.506822 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-db-sync-config-data\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.506843 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f848d4a4-4dba-4636-942d-340d83b7750b-horizon-secret-key\") pod \"horizon-6558577bcc-xrft9\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.506869 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f848d4a4-4dba-4636-942d-340d83b7750b-scripts\") pod \"horizon-6558577bcc-xrft9\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.510176 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-config-data\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.510259 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-combined-ca-bundle\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.510291 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxtkz\" (UniqueName: \"kubernetes.io/projected/5c75af6d-6842-49b5-aebe-54feb0644942-kube-api-access-fxtkz\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.510315 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-config-data\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.510333 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0670252-0ef9-4bec-bd86-a96560faf4d4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.510378 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbznv\" (UniqueName: \"kubernetes.io/projected/e0670252-0ef9-4bec-bd86-a96560faf4d4-kube-api-access-nbznv\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.510458 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-scripts\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.510502 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.510604 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f848d4a4-4dba-4636-942d-340d83b7750b-config-data\") pod \"horizon-6558577bcc-xrft9\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.510641 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c75af6d-6842-49b5-aebe-54feb0644942-etc-machine-id\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.516656 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qzc74"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.519113 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7mmzn" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.523824 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5lt24"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.530670 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.547442 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cs4pt" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.572830 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.575009 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.577867 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.578662 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.595984 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615130 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9cb4\" (UniqueName: \"kubernetes.io/projected/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-kube-api-access-c9cb4\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615177 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-config\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615203 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-scripts\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615229 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615249 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f848d4a4-4dba-4636-942d-340d83b7750b-config-data\") pod \"horizon-6558577bcc-xrft9\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615263 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c75af6d-6842-49b5-aebe-54feb0644942-etc-machine-id\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615292 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615310 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615327 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615371 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nf5v\" (UniqueName: \"kubernetes.io/projected/f848d4a4-4dba-4636-942d-340d83b7750b-kube-api-access-6nf5v\") pod \"horizon-6558577bcc-xrft9\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615391 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f848d4a4-4dba-4636-942d-340d83b7750b-logs\") pod \"horizon-6558577bcc-xrft9\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615411 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0670252-0ef9-4bec-bd86-a96560faf4d4-logs\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615428 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615449 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-db-sync-config-data\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615465 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f848d4a4-4dba-4636-942d-340d83b7750b-horizon-secret-key\") pod \"horizon-6558577bcc-xrft9\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615494 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615514 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f848d4a4-4dba-4636-942d-340d83b7750b-scripts\") pod \"horizon-6558577bcc-xrft9\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " 
pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615532 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-config-data\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615547 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615585 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-combined-ca-bundle\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615608 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxtkz\" (UniqueName: \"kubernetes.io/projected/5c75af6d-6842-49b5-aebe-54feb0644942-kube-api-access-fxtkz\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615632 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-config-data\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615653 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0670252-0ef9-4bec-bd86-a96560faf4d4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615683 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.615710 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbznv\" (UniqueName: \"kubernetes.io/projected/e0670252-0ef9-4bec-bd86-a96560faf4d4-kube-api-access-nbznv\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.618613 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Mar 17 
11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.621986 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f848d4a4-4dba-4636-942d-340d83b7750b-config-data\") pod \"horizon-6558577bcc-xrft9\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.622068 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c75af6d-6842-49b5-aebe-54feb0644942-etc-machine-id\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.623581 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-scripts\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.627610 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f848d4a4-4dba-4636-942d-340d83b7750b-logs\") pod \"horizon-6558577bcc-xrft9\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.627701 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f848d4a4-4dba-4636-942d-340d83b7750b-horizon-secret-key\") pod \"horizon-6558577bcc-xrft9\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.628225 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0670252-0ef9-4bec-bd86-a96560faf4d4-logs\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.629064 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f848d4a4-4dba-4636-942d-340d83b7750b-scripts\") pod \"horizon-6558577bcc-xrft9\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.629442 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-scripts\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.629691 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-db-sync-config-data\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.630795 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0670252-0ef9-4bec-bd86-a96560faf4d4-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.648788 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-combined-ca-bundle\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.648857 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-config-data\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.649859 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.650033 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.652277 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxtkz\" (UniqueName: \"kubernetes.io/projected/5c75af6d-6842-49b5-aebe-54feb0644942-kube-api-access-fxtkz\") pod \"cinder-db-sync-qzc74\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.655385 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-config-data\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.655888 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nf5v\" (UniqueName: \"kubernetes.io/projected/f848d4a4-4dba-4636-942d-340d83b7750b-kube-api-access-6nf5v\") pod \"horizon-6558577bcc-xrft9\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.663097 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbznv\" (UniqueName: \"kubernetes.io/projected/e0670252-0ef9-4bec-bd86-a96560faf4d4-kube-api-access-nbznv\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.693756 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.716964 4742 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.717019 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.717041 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.717082 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9cb4\" (UniqueName: \"kubernetes.io/projected/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-kube-api-access-c9cb4\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.717103 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08112cc7-14e9-4a19-b51d-2881583289a7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.717125 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-config\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.717144 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z46fj\" (UniqueName: \"kubernetes.io/projected/08112cc7-14e9-4a19-b51d-2881583289a7-kube-api-access-z46fj\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.717163 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.717227 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: 
I0317 11:32:23.717242 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08112cc7-14e9-4a19-b51d-2881583289a7-logs\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.717269 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.717289 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.717308 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.717326 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.718220 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.720106 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-config\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.720211 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.720689 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.723749 4742 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.725947 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.744195 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9cb4\" (UniqueName: \"kubernetes.io/projected/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-kube-api-access-c9cb4\") pod \"dnsmasq-dns-785d8bcb8c-5lt24\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.802267 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qzc74" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.819025 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.819108 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08112cc7-14e9-4a19-b51d-2881583289a7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.819152 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z46fj\" (UniqueName: \"kubernetes.io/projected/08112cc7-14e9-4a19-b51d-2881583289a7-kube-api-access-z46fj\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.819179 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.819263 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08112cc7-14e9-4a19-b51d-2881583289a7-logs\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.819332 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.819356 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.819387 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.819762 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.822308 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08112cc7-14e9-4a19-b51d-2881583289a7-logs\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.822972 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08112cc7-14e9-4a19-b51d-2881583289a7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.831831 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.832312 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.834195 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.835117 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.850599 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.851254 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z46fj\" (UniqueName: \"kubernetes.io/projected/08112cc7-14e9-4a19-b51d-2881583289a7-kube-api-access-z46fj\") pod \"glance-default-internal-api-0\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.875659 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.908085 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:23 crc kubenswrapper[4742]: I0317 11:32:23.950195 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.006400 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-chqr4"] Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.029515 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-66ckb"] Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.143954 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68d65d5b97-9g9h5"] Mar 17 11:32:24 crc kubenswrapper[4742]: W0317 11:32:24.150604 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2e1f113_344f_4703_9ba2_d4aabebeb1d7.slice/crio-f2df3fe58722e84de899f4186e32edd3451a9ed83c29faa704c45165af53f15d WatchSource:0}: Error finding container f2df3fe58722e84de899f4186e32edd3451a9ed83c29faa704c45165af53f15d: Status 404 returned error can't find the container with id f2df3fe58722e84de899f4186e32edd3451a9ed83c29faa704c45165af53f15d Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.175404 4742 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.185709 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ctgr4"] Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.187367 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ctgr4" Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.189959 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.194869 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ctgr4"] Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.329586 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8pqh\" (UniqueName: \"kubernetes.io/projected/8976984d-7132-4b32-9246-e387fb5fa905-kube-api-access-f8pqh\") pod \"root-account-create-update-ctgr4\" (UID: \"8976984d-7132-4b32-9246-e387fb5fa905\") " pod="openstack/root-account-create-update-ctgr4" Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.330025 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8976984d-7132-4b32-9246-e387fb5fa905-operator-scripts\") pod \"root-account-create-update-ctgr4\" (UID: \"8976984d-7132-4b32-9246-e387fb5fa905\") " pod="openstack/root-account-create-update-ctgr4" Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.424005 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68d65d5b97-9g9h5" event={"ID":"d2e1f113-344f-4703-9ba2-d4aabebeb1d7","Type":"ContainerStarted","Data":"f2df3fe58722e84de899f4186e32edd3451a9ed83c29faa704c45165af53f15d"} Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.431227 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8pqh\" (UniqueName: \"kubernetes.io/projected/8976984d-7132-4b32-9246-e387fb5fa905-kube-api-access-f8pqh\") pod \"root-account-create-update-ctgr4\" (UID: \"8976984d-7132-4b32-9246-e387fb5fa905\") " pod="openstack/root-account-create-update-ctgr4" Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.431311 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8976984d-7132-4b32-9246-e387fb5fa905-operator-scripts\") pod \"root-account-create-update-ctgr4\" (UID: \"8976984d-7132-4b32-9246-e387fb5fa905\") " pod="openstack/root-account-create-update-ctgr4" Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.432101 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8976984d-7132-4b32-9246-e387fb5fa905-operator-scripts\") pod \"root-account-create-update-ctgr4\" (UID: \"8976984d-7132-4b32-9246-e387fb5fa905\") " pod="openstack/root-account-create-update-ctgr4" Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.433352 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-66ckb" event={"ID":"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef","Type":"ContainerStarted","Data":"cfca32460870cb47c5ae185754e63a29afad89e9fab87f695108afc758bae42d"} Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.437252 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-chqr4" event={"ID":"cb042f72-1b4d-4f10-aecb-697ad9780b29","Type":"ContainerStarted","Data":"feb70d42951f4c2633fb7aea964a231765460e6cac94872338c44c37aeedc47d"} Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.464560 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-f8pqh\" (UniqueName: \"kubernetes.io/projected/8976984d-7132-4b32-9246-e387fb5fa905-kube-api-access-f8pqh\") pod \"root-account-create-update-ctgr4\" (UID: \"8976984d-7132-4b32-9246-e387fb5fa905\") " pod="openstack/root-account-create-update-ctgr4" Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.506509 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.511735 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ctgr4" Mar 17 11:32:24 crc kubenswrapper[4742]: W0317 11:32:24.519519 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod603a0e75_694a_4ab5_bbe0_616f617bc949.slice/crio-eb983291c14ffe6e637c5e5a901977c8be9d712e13ab649764eac90d5bc1499b WatchSource:0}: Error finding container eb983291c14ffe6e637c5e5a901977c8be9d712e13ab649764eac90d5bc1499b: Status 404 returned error can't find the container with id eb983291c14ffe6e637c5e5a901977c8be9d712e13ab649764eac90d5bc1499b Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.526971 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7kmxq"] Mar 17 11:32:24 crc kubenswrapper[4742]: W0317 11:32:24.533520 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecfcf738_372c_42d4_a4b0_c1f88be1dd43.slice/crio-53deb4fa8067e4c621e16c5328ef6180397f859dc31bf31e84cbd9ddbd1ab926 WatchSource:0}: Error finding container 53deb4fa8067e4c621e16c5328ef6180397f859dc31bf31e84cbd9ddbd1ab926: Status 404 returned error can't find the container with id 53deb4fa8067e4c621e16c5328ef6180397f859dc31bf31e84cbd9ddbd1ab926 Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.560437 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7mmzn"] Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.596968 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cs4pt"] Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.604579 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6558577bcc-xrft9"] Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.625477 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qzc74"] Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.720741 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5lt24"] Mar 17 11:32:24 crc kubenswrapper[4742]: I0317 11:32:24.898253 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.106469 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.122527 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68d65d5b97-9g9h5"] Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.191533 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-647cff84fc-lltcg"] Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.194604 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.215514 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.246644 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-647cff84fc-lltcg"] Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.278986 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ctgr4"] Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.302479 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.356852 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-scripts\") pod \"horizon-647cff84fc-lltcg\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.356900 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-logs\") pod \"horizon-647cff84fc-lltcg\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.356938 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-horizon-secret-key\") pod \"horizon-647cff84fc-lltcg\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.357077 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-config-data\") pod \"horizon-647cff84fc-lltcg\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.357130 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd9j2\" (UniqueName: \"kubernetes.io/projected/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-kube-api-access-wd9j2\") pod \"horizon-647cff84fc-lltcg\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.459097 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd9j2\" (UniqueName: \"kubernetes.io/projected/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-kube-api-access-wd9j2\") pod \"horizon-647cff84fc-lltcg\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.459201 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-scripts\") pod \"horizon-647cff84fc-lltcg\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.459224 4742 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-logs\") pod \"horizon-647cff84fc-lltcg\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.459247 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-horizon-secret-key\") pod \"horizon-647cff84fc-lltcg\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.459291 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-config-data\") pod \"horizon-647cff84fc-lltcg\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.459694 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-logs\") pod \"horizon-647cff84fc-lltcg\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.460204 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-scripts\") pod \"horizon-647cff84fc-lltcg\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.460547 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-config-data\") pod \"horizon-647cff84fc-lltcg\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.464864 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6558577bcc-xrft9" event={"ID":"f848d4a4-4dba-4636-942d-340d83b7750b","Type":"ContainerStarted","Data":"17e08657e129d3e0cee1cedb9965e1416c7f09ca8da92ba11f3b7479a45bea30"} Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.467075 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cs4pt" event={"ID":"90b52e42-6eca-4585-95a0-057055089c97","Type":"ContainerStarted","Data":"f707fa9dcb2b6f1652899279bd1a27da340fd0c855146353fc118656f7e0bd11"} Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.470119 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7kmxq" event={"ID":"603a0e75-694a-4ab5-bbe0-616f617bc949","Type":"ContainerStarted","Data":"8a4d5a7aec20b9bffbb6a7e48f746ea28834748dbe582be542f07299629eb0dc"} Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.470165 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7kmxq" event={"ID":"603a0e75-694a-4ab5-bbe0-616f617bc949","Type":"ContainerStarted","Data":"eb983291c14ffe6e637c5e5a901977c8be9d712e13ab649764eac90d5bc1499b"} Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.472710 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-horizon-secret-key\") pod \"horizon-647cff84fc-lltcg\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.482590 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd9j2\" (UniqueName: \"kubernetes.io/projected/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-kube-api-access-wd9j2\") pod \"horizon-647cff84fc-lltcg\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.492439 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-7kmxq" podStartSLOduration=3.492424082 podStartE2EDuration="3.492424082s" podCreationTimestamp="2026-03-17 11:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:32:25.491875836 +0000 UTC m=+1248.618003594" watchObservedRunningTime="2026-03-17 11:32:25.492424082 +0000 UTC m=+1248.618551840" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.493296 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ctgr4" event={"ID":"8976984d-7132-4b32-9246-e387fb5fa905","Type":"ContainerStarted","Data":"234797f939f5ff8495f40cd31681f6e5f10ff280c5a645ed29cf83d22c086820"} Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.500918 4742 generic.go:334] "Generic (PLEG): container finished" podID="13e9dda0-0eeb-49e6-9b1f-bb357bb752ef" containerID="37d872d6b001f7ef2c1b9f810eff98279937099852cc65bed2e07e60224a2d27" exitCode=0 Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.500994 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-66ckb" event={"ID":"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef","Type":"ContainerDied","Data":"37d872d6b001f7ef2c1b9f810eff98279937099852cc65bed2e07e60224a2d27"} Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.504341 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecfcf738-372c-42d4-a4b0-c1f88be1dd43","Type":"ContainerStarted","Data":"53deb4fa8067e4c621e16c5328ef6180397f859dc31bf31e84cbd9ddbd1ab926"} Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.508859 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0670252-0ef9-4bec-bd86-a96560faf4d4","Type":"ContainerStarted","Data":"6602e57a67978eeab1009cde41d73942573000659b3c38fb861c609e8114649f"} Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.511781 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7mmzn" event={"ID":"e3261b59-fc08-4596-bda8-7b398ef979e4","Type":"ContainerStarted","Data":"99fbe9421a6a14651f763389679f8080afc67c31693d12d9a7ff25ca638692f5"} Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.513300 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qzc74" event={"ID":"5c75af6d-6842-49b5-aebe-54feb0644942","Type":"ContainerStarted","Data":"6b4fe8e438beafa20a8e2a80310602c0b0abcceb25ccdd88cdb82163d66d9be7"} Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.514726 4742 generic.go:334] "Generic (PLEG): container finished" podID="3a2ec2cf-0d0e-455c-82af-03ddae3858bd" containerID="b61e82d3be7e8c91ce430324c0ef11f59aeec617d63b6c7eadb8f30bc5b111f0" exitCode=0 Mar 17 11:32:25 crc 
kubenswrapper[4742]: I0317 11:32:25.514783 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" event={"ID":"3a2ec2cf-0d0e-455c-82af-03ddae3858bd","Type":"ContainerDied","Data":"b61e82d3be7e8c91ce430324c0ef11f59aeec617d63b6c7eadb8f30bc5b111f0"} Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.514807 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" event={"ID":"3a2ec2cf-0d0e-455c-82af-03ddae3858bd","Type":"ContainerStarted","Data":"5f73ea0aff6744ed1ad9f3c7076712de858887d55d0a7b4932cad52091908a97"} Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.521518 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-chqr4" event={"ID":"cb042f72-1b4d-4f10-aecb-697ad9780b29","Type":"ContainerStarted","Data":"27ebc91815b4e6960eebc1a252e32864ae7dbbb0a6c3f4a56b1ebb4dccc14eaa"} Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.583414 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.604943 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-chqr4" podStartSLOduration=3.604925555 podStartE2EDuration="3.604925555s" podCreationTimestamp="2026-03-17 11:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:32:25.559080168 +0000 UTC m=+1248.685207926" watchObservedRunningTime="2026-03-17 11:32:25.604925555 +0000 UTC m=+1248.731053313" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.821137 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.874841 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 11:32:25 crc kubenswrapper[4742]: E0317 11:32:25.970619 4742 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8976984d_7132_4b32_9246_e387fb5fa905.slice/crio-conmon-0eba506e27377ff6e5b042e006a886dd1dcb2d7f32146237492a30397fd488f3.scope\": RecentStats: unable to find data in memory cache]" Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.971894 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-dns-swift-storage-0\") pod \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.972094 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-ovsdbserver-sb\") pod \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.972139 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km42w\" (UniqueName: \"kubernetes.io/projected/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-kube-api-access-km42w\") pod \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.972162 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-config\") pod \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.972189 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-ovsdbserver-nb\") pod \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " Mar 17 11:32:25 crc kubenswrapper[4742]: I0317 11:32:25.972232 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-dns-svc\") pod \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\" (UID: \"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef\") " Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:25.999121 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "13e9dda0-0eeb-49e6-9b1f-bb357bb752ef" (UID: "13e9dda0-0eeb-49e6-9b1f-bb357bb752ef"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.001546 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-kube-api-access-km42w" (OuterVolumeSpecName: "kube-api-access-km42w") pod "13e9dda0-0eeb-49e6-9b1f-bb357bb752ef" (UID: "13e9dda0-0eeb-49e6-9b1f-bb357bb752ef"). InnerVolumeSpecName "kube-api-access-km42w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.012610 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "13e9dda0-0eeb-49e6-9b1f-bb357bb752ef" (UID: "13e9dda0-0eeb-49e6-9b1f-bb357bb752ef"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.015591 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-config" (OuterVolumeSpecName: "config") pod "13e9dda0-0eeb-49e6-9b1f-bb357bb752ef" (UID: "13e9dda0-0eeb-49e6-9b1f-bb357bb752ef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.034101 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "13e9dda0-0eeb-49e6-9b1f-bb357bb752ef" (UID: "13e9dda0-0eeb-49e6-9b1f-bb357bb752ef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.052956 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "13e9dda0-0eeb-49e6-9b1f-bb357bb752ef" (UID: "13e9dda0-0eeb-49e6-9b1f-bb357bb752ef"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.074619 4742 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.074650 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.074659 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km42w\" (UniqueName: \"kubernetes.io/projected/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-kube-api-access-km42w\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.074671 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.074681 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.074689 4742 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.236955 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-647cff84fc-lltcg"] Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.543605 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0670252-0ef9-4bec-bd86-a96560faf4d4","Type":"ContainerStarted","Data":"972b9842c2771d49514f2bb33841fdda64b2d5f3606bf6cf485d924353397228"} Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.546241 4742 generic.go:334] "Generic (PLEG): container finished" podID="8976984d-7132-4b32-9246-e387fb5fa905" containerID="0eba506e27377ff6e5b042e006a886dd1dcb2d7f32146237492a30397fd488f3" exitCode=0 Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.546301 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ctgr4" event={"ID":"8976984d-7132-4b32-9246-e387fb5fa905","Type":"ContainerDied","Data":"0eba506e27377ff6e5b042e006a886dd1dcb2d7f32146237492a30397fd488f3"} Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.573057 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-66ckb" event={"ID":"13e9dda0-0eeb-49e6-9b1f-bb357bb752ef","Type":"ContainerDied","Data":"cfca32460870cb47c5ae185754e63a29afad89e9fab87f695108afc758bae42d"} Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.573106 4742 scope.go:117] "RemoveContainer" containerID="37d872d6b001f7ef2c1b9f810eff98279937099852cc65bed2e07e60224a2d27" Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.573070 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-66ckb" Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.596338 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"08112cc7-14e9-4a19-b51d-2881583289a7","Type":"ContainerStarted","Data":"9a2e1d975daf505cd2aa75621db680ca978dd6563ea5851dc0eaae5acc22d55b"} Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.651923 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-647cff84fc-lltcg" event={"ID":"4eb7c4c3-2dad-464d-8e2c-09e618d140e4","Type":"ContainerStarted","Data":"a5089c9fb0b8d6a07ddef3db40f657c3d1eb01707c6f4bd95b306cc377b1207f"} Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.714566 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-66ckb"] Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.714602 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.714621 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" event={"ID":"3a2ec2cf-0d0e-455c-82af-03ddae3858bd","Type":"ContainerStarted","Data":"a171d31afe2a8fee197c1e824702384e0fe66f168432f5b13b1587e1abe0d3d0"} Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.715266 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-66ckb"] Mar 17 11:32:26 crc kubenswrapper[4742]: I0317 11:32:26.717151 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" podStartSLOduration=3.7171337749999998 podStartE2EDuration="3.717133775s" podCreationTimestamp="2026-03-17 11:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:32:26.700149972 +0000 UTC m=+1249.826277730" watchObservedRunningTime="2026-03-17 11:32:26.717133775 +0000 UTC m=+1249.843261523" Mar 17 11:32:27 crc kubenswrapper[4742]: I0317 11:32:27.690938 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0670252-0ef9-4bec-bd86-a96560faf4d4","Type":"ContainerStarted","Data":"374d05242d8d6a4be6204eb855b905eaa2bbf7f53aacf4468460a8fdd83164bc"} Mar 17 11:32:27 crc kubenswrapper[4742]: I0317 11:32:27.691421 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e0670252-0ef9-4bec-bd86-a96560faf4d4" containerName="glance-log" containerID="cri-o://972b9842c2771d49514f2bb33841fdda64b2d5f3606bf6cf485d924353397228" gracePeriod=30 Mar 17 11:32:27 crc kubenswrapper[4742]: I0317 11:32:27.691887 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e0670252-0ef9-4bec-bd86-a96560faf4d4" containerName="glance-httpd" containerID="cri-o://374d05242d8d6a4be6204eb855b905eaa2bbf7f53aacf4468460a8fdd83164bc" gracePeriod=30 Mar 17 11:32:27 crc kubenswrapper[4742]: I0317 11:32:27.703356 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"08112cc7-14e9-4a19-b51d-2881583289a7","Type":"ContainerStarted","Data":"2d21744bd7eebff5a8be8f9f5e1aa21e2d81803774869f771ed707a1839c7905"} Mar 17 11:32:27 crc kubenswrapper[4742]: I0317 11:32:27.721353 4742 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.721333867 podStartE2EDuration="4.721333867s" podCreationTimestamp="2026-03-17 11:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:32:27.717118299 +0000 UTC m=+1250.843246057" watchObservedRunningTime="2026-03-17 11:32:27.721333867 +0000 UTC m=+1250.847461625" Mar 17 11:32:28 crc kubenswrapper[4742]: I0317 11:32:28.111511 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ctgr4" Mar 17 11:32:28 crc kubenswrapper[4742]: I0317 11:32:28.228545 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8976984d-7132-4b32-9246-e387fb5fa905-operator-scripts\") pod \"8976984d-7132-4b32-9246-e387fb5fa905\" (UID: \"8976984d-7132-4b32-9246-e387fb5fa905\") " Mar 17 11:32:28 crc kubenswrapper[4742]: I0317 11:32:28.228697 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8pqh\" (UniqueName: \"kubernetes.io/projected/8976984d-7132-4b32-9246-e387fb5fa905-kube-api-access-f8pqh\") pod \"8976984d-7132-4b32-9246-e387fb5fa905\" (UID: \"8976984d-7132-4b32-9246-e387fb5fa905\") " Mar 17 11:32:28 crc kubenswrapper[4742]: I0317 11:32:28.229839 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8976984d-7132-4b32-9246-e387fb5fa905-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8976984d-7132-4b32-9246-e387fb5fa905" (UID: "8976984d-7132-4b32-9246-e387fb5fa905"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:28 crc kubenswrapper[4742]: I0317 11:32:28.331101 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8976984d-7132-4b32-9246-e387fb5fa905-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:28 crc kubenswrapper[4742]: I0317 11:32:28.551158 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8976984d-7132-4b32-9246-e387fb5fa905-kube-api-access-f8pqh" (OuterVolumeSpecName: "kube-api-access-f8pqh") pod "8976984d-7132-4b32-9246-e387fb5fa905" (UID: "8976984d-7132-4b32-9246-e387fb5fa905"). InnerVolumeSpecName "kube-api-access-f8pqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:32:28 crc kubenswrapper[4742]: I0317 11:32:28.636250 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8pqh\" (UniqueName: \"kubernetes.io/projected/8976984d-7132-4b32-9246-e387fb5fa905-kube-api-access-f8pqh\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:28 crc kubenswrapper[4742]: I0317 11:32:28.679601 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13e9dda0-0eeb-49e6-9b1f-bb357bb752ef" path="/var/lib/kubelet/pods/13e9dda0-0eeb-49e6-9b1f-bb357bb752ef/volumes" Mar 17 11:32:28 crc kubenswrapper[4742]: I0317 11:32:28.726439 4742 generic.go:334] "Generic (PLEG): container finished" podID="e0670252-0ef9-4bec-bd86-a96560faf4d4" containerID="972b9842c2771d49514f2bb33841fdda64b2d5f3606bf6cf485d924353397228" exitCode=143 Mar 17 11:32:28 crc kubenswrapper[4742]: I0317 11:32:28.726494 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0670252-0ef9-4bec-bd86-a96560faf4d4","Type":"ContainerDied","Data":"972b9842c2771d49514f2bb33841fdda64b2d5f3606bf6cf485d924353397228"} Mar 17 11:32:28 crc kubenswrapper[4742]: I0317 11:32:28.731047 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ctgr4" event={"ID":"8976984d-7132-4b32-9246-e387fb5fa905","Type":"ContainerDied","Data":"234797f939f5ff8495f40cd31681f6e5f10ff280c5a645ed29cf83d22c086820"} Mar 17 11:32:28 crc kubenswrapper[4742]: I0317 11:32:28.731083 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="234797f939f5ff8495f40cd31681f6e5f10ff280c5a645ed29cf83d22c086820" Mar 17 11:32:28 crc kubenswrapper[4742]: I0317 11:32:28.731136 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ctgr4" Mar 17 11:32:28 crc kubenswrapper[4742]: I0317 11:32:28.735866 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="08112cc7-14e9-4a19-b51d-2881583289a7" containerName="glance-log" containerID="cri-o://2d21744bd7eebff5a8be8f9f5e1aa21e2d81803774869f771ed707a1839c7905" gracePeriod=30 Mar 17 11:32:28 crc kubenswrapper[4742]: I0317 11:32:28.735950 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="08112cc7-14e9-4a19-b51d-2881583289a7" containerName="glance-httpd" containerID="cri-o://e947c5858778c74c81296df563038b05eb9389fc602944ad6e75287ae36cc481" gracePeriod=30 Mar 17 11:32:28 crc kubenswrapper[4742]: I0317 11:32:28.870527 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.870509445 podStartE2EDuration="5.870509445s" podCreationTimestamp="2026-03-17 11:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:32:28.863588723 +0000 UTC m=+1251.989716481" watchObservedRunningTime="2026-03-17 11:32:28.870509445 +0000 UTC m=+1251.996637193" Mar 17 11:32:29 crc kubenswrapper[4742]: I0317 11:32:29.750182 4742 generic.go:334] "Generic (PLEG): container finished" podID="cb042f72-1b4d-4f10-aecb-697ad9780b29" containerID="27ebc91815b4e6960eebc1a252e32864ae7dbbb0a6c3f4a56b1ebb4dccc14eaa" exitCode=0 Mar 17 11:32:29 crc kubenswrapper[4742]: I0317 11:32:29.750236 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-chqr4" event={"ID":"cb042f72-1b4d-4f10-aecb-697ad9780b29","Type":"ContainerDied","Data":"27ebc91815b4e6960eebc1a252e32864ae7dbbb0a6c3f4a56b1ebb4dccc14eaa"} Mar 17 11:32:29 crc kubenswrapper[4742]: I0317 11:32:29.754140 4742 generic.go:334] "Generic (PLEG): container finished" podID="e0670252-0ef9-4bec-bd86-a96560faf4d4" containerID="374d05242d8d6a4be6204eb855b905eaa2bbf7f53aacf4468460a8fdd83164bc" exitCode=0 Mar 17 11:32:29 crc kubenswrapper[4742]: I0317 11:32:29.754197 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0670252-0ef9-4bec-bd86-a96560faf4d4","Type":"ContainerDied","Data":"374d05242d8d6a4be6204eb855b905eaa2bbf7f53aacf4468460a8fdd83164bc"} Mar 17 11:32:29 crc kubenswrapper[4742]: I0317 11:32:29.760255 4742 generic.go:334] "Generic (PLEG): container finished" podID="08112cc7-14e9-4a19-b51d-2881583289a7" containerID="e947c5858778c74c81296df563038b05eb9389fc602944ad6e75287ae36cc481" exitCode=143 Mar 17 11:32:29 crc kubenswrapper[4742]: I0317 11:32:29.760283 4742 generic.go:334] "Generic (PLEG): container finished" podID="08112cc7-14e9-4a19-b51d-2881583289a7" containerID="2d21744bd7eebff5a8be8f9f5e1aa21e2d81803774869f771ed707a1839c7905" exitCode=143 Mar 17 11:32:29 crc kubenswrapper[4742]: I0317 11:32:29.760303 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"08112cc7-14e9-4a19-b51d-2881583289a7","Type":"ContainerDied","Data":"e947c5858778c74c81296df563038b05eb9389fc602944ad6e75287ae36cc481"} Mar 17 11:32:29 crc kubenswrapper[4742]: I0317 11:32:29.760323 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"08112cc7-14e9-4a19-b51d-2881583289a7","Type":"ContainerDied","Data":"2d21744bd7eebff5a8be8f9f5e1aa21e2d81803774869f771ed707a1839c7905"} Mar 17 11:32:30 crc kubenswrapper[4742]: I0317 11:32:30.402642 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ctgr4"] Mar 17 11:32:30 crc kubenswrapper[4742]: I0317 11:32:30.412591 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ctgr4"] Mar 17 11:32:30 crc kubenswrapper[4742]: I0317 11:32:30.687089 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8976984d-7132-4b32-9246-e387fb5fa905" path="/var/lib/kubelet/pods/8976984d-7132-4b32-9246-e387fb5fa905/volumes" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.731665 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6558577bcc-xrft9"] Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.779335 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5cbc75d594-mvhf5"] Mar 17 11:32:31 crc kubenswrapper[4742]: E0317 11:32:31.780217 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8976984d-7132-4b32-9246-e387fb5fa905" containerName="mariadb-account-create-update" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.780241 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="8976984d-7132-4b32-9246-e387fb5fa905" containerName="mariadb-account-create-update" Mar 17 11:32:31 crc kubenswrapper[4742]: E0317 11:32:31.780265 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13e9dda0-0eeb-49e6-9b1f-bb357bb752ef" containerName="init" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.780276 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="13e9dda0-0eeb-49e6-9b1f-bb357bb752ef" containerName="init" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.780890 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="8976984d-7132-4b32-9246-e387fb5fa905" containerName="mariadb-account-create-update" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.780939 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="13e9dda0-0eeb-49e6-9b1f-bb357bb752ef" containerName="init" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.782285 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.784249 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.804833 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cbc75d594-mvhf5"] Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.811876 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m62rh\" (UniqueName: \"kubernetes.io/projected/f2bbef92-cd02-42d8-b81d-ab7248e29328-kube-api-access-m62rh\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.811945 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-combined-ca-bundle\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.811991 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-horizon-secret-key\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.812011 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-horizon-tls-certs\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.812030 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2bbef92-cd02-42d8-b81d-ab7248e29328-scripts\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.812078 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2bbef92-cd02-42d8-b81d-ab7248e29328-logs\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.812094 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2bbef92-cd02-42d8-b81d-ab7248e29328-config-data\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.851452 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-647cff84fc-lltcg"] Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.879325 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5c4556b444-kq454"] Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.886208 
4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.895508 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c4556b444-kq454"] Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.914348 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/480fea20-eab5-4c68-9bc3-9b218ba0b43d-horizon-tls-certs\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.914399 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/480fea20-eab5-4c68-9bc3-9b218ba0b43d-horizon-secret-key\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.914604 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phvhw\" (UniqueName: \"kubernetes.io/projected/480fea20-eab5-4c68-9bc3-9b218ba0b43d-kube-api-access-phvhw\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.914687 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m62rh\" (UniqueName: \"kubernetes.io/projected/f2bbef92-cd02-42d8-b81d-ab7248e29328-kube-api-access-m62rh\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.914733 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-combined-ca-bundle\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.914795 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/480fea20-eab5-4c68-9bc3-9b218ba0b43d-scripts\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.914825 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/480fea20-eab5-4c68-9bc3-9b218ba0b43d-combined-ca-bundle\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.914879 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-horizon-secret-key\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.914925 4742 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/480fea20-eab5-4c68-9bc3-9b218ba0b43d-config-data\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.914952 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-horizon-tls-certs\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.914979 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2bbef92-cd02-42d8-b81d-ab7248e29328-scripts\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.915015 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/480fea20-eab5-4c68-9bc3-9b218ba0b43d-logs\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.915097 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2bbef92-cd02-42d8-b81d-ab7248e29328-logs\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.915144 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2bbef92-cd02-42d8-b81d-ab7248e29328-config-data\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.915592 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2bbef92-cd02-42d8-b81d-ab7248e29328-logs\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.917487 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2bbef92-cd02-42d8-b81d-ab7248e29328-config-data\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.920479 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2bbef92-cd02-42d8-b81d-ab7248e29328-scripts\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.920957 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-horizon-tls-certs\") pod \"horizon-5cbc75d594-mvhf5\" (UID: 
\"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.928442 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-horizon-secret-key\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.928707 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-combined-ca-bundle\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:31 crc kubenswrapper[4742]: I0317 11:32:31.937657 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m62rh\" (UniqueName: \"kubernetes.io/projected/f2bbef92-cd02-42d8-b81d-ab7248e29328-kube-api-access-m62rh\") pod \"horizon-5cbc75d594-mvhf5\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:32 crc kubenswrapper[4742]: I0317 11:32:32.016933 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/480fea20-eab5-4c68-9bc3-9b218ba0b43d-horizon-tls-certs\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:32 crc kubenswrapper[4742]: I0317 11:32:32.016977 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/480fea20-eab5-4c68-9bc3-9b218ba0b43d-horizon-secret-key\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:32 crc kubenswrapper[4742]: I0317 11:32:32.017015 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phvhw\" (UniqueName: \"kubernetes.io/projected/480fea20-eab5-4c68-9bc3-9b218ba0b43d-kube-api-access-phvhw\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:32 crc kubenswrapper[4742]: I0317 11:32:32.017060 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/480fea20-eab5-4c68-9bc3-9b218ba0b43d-scripts\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:32 crc kubenswrapper[4742]: I0317 11:32:32.017079 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/480fea20-eab5-4c68-9bc3-9b218ba0b43d-combined-ca-bundle\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:32 crc kubenswrapper[4742]: I0317 11:32:32.017103 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/480fea20-eab5-4c68-9bc3-9b218ba0b43d-config-data\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:32 crc 
kubenswrapper[4742]: I0317 11:32:32.017130 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/480fea20-eab5-4c68-9bc3-9b218ba0b43d-logs\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:32 crc kubenswrapper[4742]: I0317 11:32:32.017490 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/480fea20-eab5-4c68-9bc3-9b218ba0b43d-logs\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:32 crc kubenswrapper[4742]: I0317 11:32:32.018779 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/480fea20-eab5-4c68-9bc3-9b218ba0b43d-scripts\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:32 crc kubenswrapper[4742]: I0317 11:32:32.019489 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/480fea20-eab5-4c68-9bc3-9b218ba0b43d-config-data\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:32 crc kubenswrapper[4742]: I0317 11:32:32.021037 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/480fea20-eab5-4c68-9bc3-9b218ba0b43d-horizon-tls-certs\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:32 crc kubenswrapper[4742]: I0317 11:32:32.021381 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/480fea20-eab5-4c68-9bc3-9b218ba0b43d-combined-ca-bundle\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:32 crc kubenswrapper[4742]: I0317 11:32:32.022022 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/480fea20-eab5-4c68-9bc3-9b218ba0b43d-horizon-secret-key\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:32 crc kubenswrapper[4742]: I0317 11:32:32.037233 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phvhw\" (UniqueName: \"kubernetes.io/projected/480fea20-eab5-4c68-9bc3-9b218ba0b43d-kube-api-access-phvhw\") pod \"horizon-5c4556b444-kq454\" (UID: \"480fea20-eab5-4c68-9bc3-9b218ba0b43d\") " pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:32 crc kubenswrapper[4742]: I0317 11:32:32.116816 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:32:32 crc kubenswrapper[4742]: I0317 11:32:32.203112 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:32:33 crc kubenswrapper[4742]: I0317 11:32:33.910095 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:32:33 crc kubenswrapper[4742]: I0317 11:32:33.986876 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-w62xn"] Mar 17 11:32:33 crc kubenswrapper[4742]: I0317 11:32:33.987605 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" podUID="1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" containerName="dnsmasq-dns" containerID="cri-o://821f7b0c9ffb1c5b507d88792fc4fdc096a2cfb7a0a31e127fe30419b71a81af" gracePeriod=10 Mar 17 11:32:34 crc kubenswrapper[4742]: I0317 11:32:34.814203 4742 generic.go:334] "Generic (PLEG): container finished" podID="1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" containerID="821f7b0c9ffb1c5b507d88792fc4fdc096a2cfb7a0a31e127fe30419b71a81af" exitCode=0 Mar 17 11:32:34 crc kubenswrapper[4742]: I0317 11:32:34.814281 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" event={"ID":"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c","Type":"ContainerDied","Data":"821f7b0c9ffb1c5b507d88792fc4fdc096a2cfb7a0a31e127fe30419b71a81af"} Mar 17 11:32:35 crc kubenswrapper[4742]: I0317 11:32:35.422424 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-wjx6c"] Mar 17 11:32:35 crc kubenswrapper[4742]: I0317 11:32:35.423770 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wjx6c" Mar 17 11:32:35 crc kubenswrapper[4742]: I0317 11:32:35.426390 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 17 11:32:35 crc kubenswrapper[4742]: I0317 11:32:35.431576 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wjx6c"] Mar 17 11:32:35 crc kubenswrapper[4742]: I0317 11:32:35.479735 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxjsb\" (UniqueName: \"kubernetes.io/projected/23024865-2dad-4a51-af8b-7d7a224c8ce8-kube-api-access-rxjsb\") pod \"root-account-create-update-wjx6c\" (UID: \"23024865-2dad-4a51-af8b-7d7a224c8ce8\") " pod="openstack/root-account-create-update-wjx6c" Mar 17 11:32:35 crc kubenswrapper[4742]: I0317 11:32:35.479811 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23024865-2dad-4a51-af8b-7d7a224c8ce8-operator-scripts\") pod \"root-account-create-update-wjx6c\" (UID: \"23024865-2dad-4a51-af8b-7d7a224c8ce8\") " pod="openstack/root-account-create-update-wjx6c" Mar 17 11:32:35 crc kubenswrapper[4742]: I0317 11:32:35.581793 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxjsb\" (UniqueName: \"kubernetes.io/projected/23024865-2dad-4a51-af8b-7d7a224c8ce8-kube-api-access-rxjsb\") pod \"root-account-create-update-wjx6c\" (UID: \"23024865-2dad-4a51-af8b-7d7a224c8ce8\") " pod="openstack/root-account-create-update-wjx6c" Mar 17 11:32:35 crc kubenswrapper[4742]: I0317 11:32:35.581922 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/23024865-2dad-4a51-af8b-7d7a224c8ce8-operator-scripts\") pod \"root-account-create-update-wjx6c\" (UID: \"23024865-2dad-4a51-af8b-7d7a224c8ce8\") " pod="openstack/root-account-create-update-wjx6c" Mar 17 11:32:35 crc kubenswrapper[4742]: I0317 11:32:35.582847 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23024865-2dad-4a51-af8b-7d7a224c8ce8-operator-scripts\") pod \"root-account-create-update-wjx6c\" (UID: \"23024865-2dad-4a51-af8b-7d7a224c8ce8\") " pod="openstack/root-account-create-update-wjx6c" Mar 17 11:32:35 crc kubenswrapper[4742]: I0317 11:32:35.612647 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxjsb\" (UniqueName: \"kubernetes.io/projected/23024865-2dad-4a51-af8b-7d7a224c8ce8-kube-api-access-rxjsb\") pod \"root-account-create-update-wjx6c\" (UID: \"23024865-2dad-4a51-af8b-7d7a224c8ce8\") " pod="openstack/root-account-create-update-wjx6c" Mar 17 11:32:35 crc kubenswrapper[4742]: I0317 11:32:35.793553 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wjx6c" Mar 17 11:32:36 crc kubenswrapper[4742]: I0317 11:32:36.806922 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" podUID="1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.133:5353: connect: connection refused" Mar 17 11:32:39 crc kubenswrapper[4742]: E0317 11:32:39.401573 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 17 11:32:39 crc kubenswrapper[4742]: E0317 11:32:39.402624 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n79h5d5h645h5d6h578h66ch7hbfh554h7fh546h686h6ch5c5hb4h685h575h546h5fbh585h654h565h545h54fh645h599h84h5bdh648h68fh59dh65fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6nf5v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6558577bcc-xrft9_openstack(f848d4a4-4dba-4636-942d-340d83b7750b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 11:32:39 crc kubenswrapper[4742]: E0317 11:32:39.405845 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6558577bcc-xrft9" podUID="f848d4a4-4dba-4636-942d-340d83b7750b" Mar 17 11:32:39 crc kubenswrapper[4742]: E0317 11:32:39.550589 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 17 11:32:39 crc kubenswrapper[4742]: E0317 11:32:39.550814 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n56dhdh96h4hbfh649h6h57hdch5b6h668h578h67dh64ch5bch65ch645h57dh669h67h567h5ddh665hffhd5h656h564h599h59fh5fbh7fh687q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2jqsd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-68d65d5b97-9g9h5_openstack(d2e1f113-344f-4703-9ba2-d4aabebeb1d7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 11:32:39 crc kubenswrapper[4742]: E0317 11:32:39.557082 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-68d65d5b97-9g9h5" podUID="d2e1f113-344f-4703-9ba2-d4aabebeb1d7" Mar 17 11:32:40 crc kubenswrapper[4742]: I0317 11:32:40.568493 4742 scope.go:117] "RemoveContainer" containerID="e9c18b34265ae35ef76b3212e8872778d92697fbfc3fb669589557c198c9c394" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.508486 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.514721 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.527133 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.607340 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-combined-ca-bundle\") pod \"e0670252-0ef9-4bec-bd86-a96560faf4d4\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.607398 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-scripts\") pod \"08112cc7-14e9-4a19-b51d-2881583289a7\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.607445 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z46fj\" (UniqueName: \"kubernetes.io/projected/08112cc7-14e9-4a19-b51d-2881583289a7-kube-api-access-z46fj\") pod \"08112cc7-14e9-4a19-b51d-2881583289a7\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.607502 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08112cc7-14e9-4a19-b51d-2881583289a7-httpd-run\") pod \"08112cc7-14e9-4a19-b51d-2881583289a7\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.607570 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-config-data\") pod \"e0670252-0ef9-4bec-bd86-a96560faf4d4\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.607622 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0670252-0ef9-4bec-bd86-a96560faf4d4-httpd-run\") pod \"e0670252-0ef9-4bec-bd86-a96560faf4d4\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.607649 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-internal-tls-certs\") pod \"08112cc7-14e9-4a19-b51d-2881583289a7\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.607669 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-combined-ca-bundle\") pod \"08112cc7-14e9-4a19-b51d-2881583289a7\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.607701 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-scripts\") pod \"e0670252-0ef9-4bec-bd86-a96560faf4d4\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.607752 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"e0670252-0ef9-4bec-bd86-a96560faf4d4\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " Mar 17 11:32:41 crc 
kubenswrapper[4742]: I0317 11:32:41.607776 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-config-data\") pod \"08112cc7-14e9-4a19-b51d-2881583289a7\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.607801 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbznv\" (UniqueName: \"kubernetes.io/projected/e0670252-0ef9-4bec-bd86-a96560faf4d4-kube-api-access-nbznv\") pod \"e0670252-0ef9-4bec-bd86-a96560faf4d4\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.607835 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"08112cc7-14e9-4a19-b51d-2881583289a7\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.607857 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08112cc7-14e9-4a19-b51d-2881583289a7-logs\") pod \"08112cc7-14e9-4a19-b51d-2881583289a7\" (UID: \"08112cc7-14e9-4a19-b51d-2881583289a7\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.607883 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0670252-0ef9-4bec-bd86-a96560faf4d4-logs\") pod \"e0670252-0ef9-4bec-bd86-a96560faf4d4\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.608111 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-public-tls-certs\") pod \"e0670252-0ef9-4bec-bd86-a96560faf4d4\" (UID: \"e0670252-0ef9-4bec-bd86-a96560faf4d4\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.610575 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08112cc7-14e9-4a19-b51d-2881583289a7-logs" (OuterVolumeSpecName: "logs") pod "08112cc7-14e9-4a19-b51d-2881583289a7" (UID: "08112cc7-14e9-4a19-b51d-2881583289a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.610588 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08112cc7-14e9-4a19-b51d-2881583289a7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "08112cc7-14e9-4a19-b51d-2881583289a7" (UID: "08112cc7-14e9-4a19-b51d-2881583289a7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.612602 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0670252-0ef9-4bec-bd86-a96560faf4d4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e0670252-0ef9-4bec-bd86-a96560faf4d4" (UID: "e0670252-0ef9-4bec-bd86-a96560faf4d4"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.612764 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0670252-0ef9-4bec-bd86-a96560faf4d4-logs" (OuterVolumeSpecName: "logs") pod "e0670252-0ef9-4bec-bd86-a96560faf4d4" (UID: "e0670252-0ef9-4bec-bd86-a96560faf4d4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.617394 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "08112cc7-14e9-4a19-b51d-2881583289a7" (UID: "08112cc7-14e9-4a19-b51d-2881583289a7"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.618799 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0670252-0ef9-4bec-bd86-a96560faf4d4-kube-api-access-nbznv" (OuterVolumeSpecName: "kube-api-access-nbznv") pod "e0670252-0ef9-4bec-bd86-a96560faf4d4" (UID: "e0670252-0ef9-4bec-bd86-a96560faf4d4"). InnerVolumeSpecName "kube-api-access-nbznv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.631599 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "e0670252-0ef9-4bec-bd86-a96560faf4d4" (UID: "e0670252-0ef9-4bec-bd86-a96560faf4d4"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.631609 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-scripts" (OuterVolumeSpecName: "scripts") pod "e0670252-0ef9-4bec-bd86-a96560faf4d4" (UID: "e0670252-0ef9-4bec-bd86-a96560faf4d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.651415 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0670252-0ef9-4bec-bd86-a96560faf4d4" (UID: "e0670252-0ef9-4bec-bd86-a96560faf4d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.672930 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-scripts" (OuterVolumeSpecName: "scripts") pod "08112cc7-14e9-4a19-b51d-2881583289a7" (UID: "08112cc7-14e9-4a19-b51d-2881583289a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.673529 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08112cc7-14e9-4a19-b51d-2881583289a7-kube-api-access-z46fj" (OuterVolumeSpecName: "kube-api-access-z46fj") pod "08112cc7-14e9-4a19-b51d-2881583289a7" (UID: "08112cc7-14e9-4a19-b51d-2881583289a7"). InnerVolumeSpecName "kube-api-access-z46fj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.693585 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-config-data" (OuterVolumeSpecName: "config-data") pod "08112cc7-14e9-4a19-b51d-2881583289a7" (UID: "08112cc7-14e9-4a19-b51d-2881583289a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.697337 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "08112cc7-14e9-4a19-b51d-2881583289a7" (UID: "08112cc7-14e9-4a19-b51d-2881583289a7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.701550 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e0670252-0ef9-4bec-bd86-a96560faf4d4" (UID: "e0670252-0ef9-4bec-bd86-a96560faf4d4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.703174 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08112cc7-14e9-4a19-b51d-2881583289a7" (UID: "08112cc7-14e9-4a19-b51d-2881583289a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.705200 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-config-data" (OuterVolumeSpecName: "config-data") pod "e0670252-0ef9-4bec-bd86-a96560faf4d4" (UID: "e0670252-0ef9-4bec-bd86-a96560faf4d4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.709834 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-fernet-keys\") pod \"cb042f72-1b4d-4f10-aecb-697ad9780b29\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.709931 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-config-data\") pod \"cb042f72-1b4d-4f10-aecb-697ad9780b29\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.710039 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-combined-ca-bundle\") pod \"cb042f72-1b4d-4f10-aecb-697ad9780b29\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.710092 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbjch\" (UniqueName: \"kubernetes.io/projected/cb042f72-1b4d-4f10-aecb-697ad9780b29-kube-api-access-gbjch\") pod \"cb042f72-1b4d-4f10-aecb-697ad9780b29\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.710120 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-credential-keys\") pod \"cb042f72-1b4d-4f10-aecb-697ad9780b29\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.710226 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-scripts\") pod \"cb042f72-1b4d-4f10-aecb-697ad9780b29\" (UID: \"cb042f72-1b4d-4f10-aecb-697ad9780b29\") " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.710693 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0670252-0ef9-4bec-bd86-a96560faf4d4-logs\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.710714 4742 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.710725 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.710735 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.710746 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z46fj\" (UniqueName: \"kubernetes.io/projected/08112cc7-14e9-4a19-b51d-2881583289a7-kube-api-access-z46fj\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.710756 4742 
reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/08112cc7-14e9-4a19-b51d-2881583289a7-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.710766 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.710775 4742 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e0670252-0ef9-4bec-bd86-a96560faf4d4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.710785 4742 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.710974 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.710988 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0670252-0ef9-4bec-bd86-a96560faf4d4-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.711010 4742 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.711020 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08112cc7-14e9-4a19-b51d-2881583289a7-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.711039 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbznv\" (UniqueName: \"kubernetes.io/projected/e0670252-0ef9-4bec-bd86-a96560faf4d4-kube-api-access-nbznv\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.711058 4742 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.711068 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/08112cc7-14e9-4a19-b51d-2881583289a7-logs\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.714313 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cb042f72-1b4d-4f10-aecb-697ad9780b29" (UID: "cb042f72-1b4d-4f10-aecb-697ad9780b29"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.716794 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-scripts" (OuterVolumeSpecName: "scripts") pod "cb042f72-1b4d-4f10-aecb-697ad9780b29" (UID: "cb042f72-1b4d-4f10-aecb-697ad9780b29"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.717526 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb042f72-1b4d-4f10-aecb-697ad9780b29-kube-api-access-gbjch" (OuterVolumeSpecName: "kube-api-access-gbjch") pod "cb042f72-1b4d-4f10-aecb-697ad9780b29" (UID: "cb042f72-1b4d-4f10-aecb-697ad9780b29"). InnerVolumeSpecName "kube-api-access-gbjch". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.730153 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cb042f72-1b4d-4f10-aecb-697ad9780b29" (UID: "cb042f72-1b4d-4f10-aecb-697ad9780b29"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.730821 4742 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.732066 4742 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.737059 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-config-data" (OuterVolumeSpecName: "config-data") pod "cb042f72-1b4d-4f10-aecb-697ad9780b29" (UID: "cb042f72-1b4d-4f10-aecb-697ad9780b29"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.750366 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb042f72-1b4d-4f10-aecb-697ad9780b29" (UID: "cb042f72-1b4d-4f10-aecb-697ad9780b29"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.813338 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.813375 4742 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.813389 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.813403 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.813419 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbjch\" (UniqueName: \"kubernetes.io/projected/cb042f72-1b4d-4f10-aecb-697ad9780b29-kube-api-access-gbjch\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.813433 4742 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cb042f72-1b4d-4f10-aecb-697ad9780b29-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.813445 4742 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.813458 4742 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.871015 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"08112cc7-14e9-4a19-b51d-2881583289a7","Type":"ContainerDied","Data":"9a2e1d975daf505cd2aa75621db680ca978dd6563ea5851dc0eaae5acc22d55b"} Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.871061 4742 scope.go:117] "RemoveContainer" containerID="e947c5858778c74c81296df563038b05eb9389fc602944ad6e75287ae36cc481" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.871063 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.872798 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-chqr4" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.872846 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-chqr4" event={"ID":"cb042f72-1b4d-4f10-aecb-697ad9780b29","Type":"ContainerDied","Data":"feb70d42951f4c2633fb7aea964a231765460e6cac94872338c44c37aeedc47d"} Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.872887 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="feb70d42951f4c2633fb7aea964a231765460e6cac94872338c44c37aeedc47d" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.880296 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e0670252-0ef9-4bec-bd86-a96560faf4d4","Type":"ContainerDied","Data":"6602e57a67978eeab1009cde41d73942573000659b3c38fb861c609e8114649f"} Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.880339 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.947144 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.959444 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.970511 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.986385 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.994747 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 11:32:41 crc kubenswrapper[4742]: E0317 11:32:41.995377 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0670252-0ef9-4bec-bd86-a96560faf4d4" containerName="glance-log" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.995409 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0670252-0ef9-4bec-bd86-a96560faf4d4" containerName="glance-log" Mar 17 11:32:41 crc kubenswrapper[4742]: E0317 11:32:41.995440 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb042f72-1b4d-4f10-aecb-697ad9780b29" containerName="keystone-bootstrap" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.995452 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb042f72-1b4d-4f10-aecb-697ad9780b29" containerName="keystone-bootstrap" Mar 17 11:32:41 crc kubenswrapper[4742]: E0317 11:32:41.995472 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08112cc7-14e9-4a19-b51d-2881583289a7" containerName="glance-httpd" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.995482 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="08112cc7-14e9-4a19-b51d-2881583289a7" containerName="glance-httpd" Mar 17 11:32:41 crc kubenswrapper[4742]: E0317 11:32:41.995501 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08112cc7-14e9-4a19-b51d-2881583289a7" containerName="glance-log" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.995511 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="08112cc7-14e9-4a19-b51d-2881583289a7" containerName="glance-log" Mar 17 11:32:41 crc kubenswrapper[4742]: E0317 11:32:41.995527 4742 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0670252-0ef9-4bec-bd86-a96560faf4d4" containerName="glance-httpd" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.995538 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0670252-0ef9-4bec-bd86-a96560faf4d4" containerName="glance-httpd" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.995807 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb042f72-1b4d-4f10-aecb-697ad9780b29" containerName="keystone-bootstrap" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.995828 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0670252-0ef9-4bec-bd86-a96560faf4d4" containerName="glance-httpd" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.995862 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0670252-0ef9-4bec-bd86-a96560faf4d4" containerName="glance-log" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.995877 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="08112cc7-14e9-4a19-b51d-2881583289a7" containerName="glance-log" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.995895 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="08112cc7-14e9-4a19-b51d-2881583289a7" containerName="glance-httpd" Mar 17 11:32:41 crc kubenswrapper[4742]: I0317 11:32:41.997397 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.000643 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.000713 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.001000 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rh74z" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.001126 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.012985 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.026865 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.028694 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.032317 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.032557 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.041530 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.124443 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.124489 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.124586 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.124630 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-scripts\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.124736 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.124788 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24b880a5-c4dc-4566-80c3-13fddf078932-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.124882 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d937ab3-6dfb-4b0c-a846-da4820bad05f-logs\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.125119 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d937ab3-6dfb-4b0c-a846-da4820bad05f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.125185 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pck8\" (UniqueName: \"kubernetes.io/projected/5d937ab3-6dfb-4b0c-a846-da4820bad05f-kube-api-access-2pck8\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.125227 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-scripts\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.125247 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-config-data\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.125344 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcqcp\" (UniqueName: \"kubernetes.io/projected/24b880a5-c4dc-4566-80c3-13fddf078932-kube-api-access-rcqcp\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.125383 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.125419 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-config-data\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.125436 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24b880a5-c4dc-4566-80c3-13fddf078932-logs\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.125458 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227003 4742 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-config-data\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227067 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24b880a5-c4dc-4566-80c3-13fddf078932-logs\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227101 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227124 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227150 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227174 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227215 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-scripts\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227243 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227266 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24b880a5-c4dc-4566-80c3-13fddf078932-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227323 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5d937ab3-6dfb-4b0c-a846-da4820bad05f-logs\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227410 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d937ab3-6dfb-4b0c-a846-da4820bad05f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227441 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pck8\" (UniqueName: \"kubernetes.io/projected/5d937ab3-6dfb-4b0c-a846-da4820bad05f-kube-api-access-2pck8\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227470 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-config-data\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227491 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-scripts\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227537 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcqcp\" (UniqueName: \"kubernetes.io/projected/24b880a5-c4dc-4566-80c3-13fddf078932-kube-api-access-rcqcp\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227565 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227747 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.227936 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d937ab3-6dfb-4b0c-a846-da4820bad05f-logs\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.228567 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d937ab3-6dfb-4b0c-a846-da4820bad05f-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.228602 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24b880a5-c4dc-4566-80c3-13fddf078932-logs\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.229057 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.229549 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24b880a5-c4dc-4566-80c3-13fddf078932-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.232361 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.232988 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-scripts\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.233136 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-scripts\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.233404 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-config-data\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.238654 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.238791 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 
crc kubenswrapper[4742]: I0317 11:32:42.239254 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.239469 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-config-data\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.249950 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pck8\" (UniqueName: \"kubernetes.io/projected/5d937ab3-6dfb-4b0c-a846-da4820bad05f-kube-api-access-2pck8\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.257725 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcqcp\" (UniqueName: \"kubernetes.io/projected/24b880a5-c4dc-4566-80c3-13fddf078932-kube-api-access-rcqcp\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.262113 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.272108 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.342457 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.352763 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.631050 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-chqr4"] Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.638812 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-chqr4"] Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.674429 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08112cc7-14e9-4a19-b51d-2881583289a7" path="/var/lib/kubelet/pods/08112cc7-14e9-4a19-b51d-2881583289a7/volumes" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.675641 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb042f72-1b4d-4f10-aecb-697ad9780b29" path="/var/lib/kubelet/pods/cb042f72-1b4d-4f10-aecb-697ad9780b29/volumes" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.676537 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0670252-0ef9-4bec-bd86-a96560faf4d4" path="/var/lib/kubelet/pods/e0670252-0ef9-4bec-bd86-a96560faf4d4/volumes" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.738450 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qf8ng"] Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.739542 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.747468 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qf8ng"] Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.750552 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.750806 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.750990 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.751106 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fjbw9" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.752165 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.836808 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-credential-keys\") pod \"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.836941 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-config-data\") pod \"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.837003 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx6cs\" (UniqueName: \"kubernetes.io/projected/a81353e8-6a78-46f3-ae59-028afb88c5ef-kube-api-access-lx6cs\") pod 
\"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.837066 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-combined-ca-bundle\") pod \"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.837126 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-fernet-keys\") pod \"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.837153 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-scripts\") pod \"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.938992 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-config-data\") pod \"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.939078 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx6cs\" (UniqueName: \"kubernetes.io/projected/a81353e8-6a78-46f3-ae59-028afb88c5ef-kube-api-access-lx6cs\") pod \"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.939140 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-combined-ca-bundle\") pod \"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.939302 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-fernet-keys\") pod \"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.939369 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-scripts\") pod \"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.939481 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-credential-keys\") pod \"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " 
pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.942977 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-combined-ca-bundle\") pod \"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.944114 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-credential-keys\") pod \"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.944831 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-config-data\") pod \"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.945809 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-scripts\") pod \"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.946248 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-fernet-keys\") pod \"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:42 crc kubenswrapper[4742]: I0317 11:32:42.957170 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx6cs\" (UniqueName: \"kubernetes.io/projected/a81353e8-6a78-46f3-ae59-028afb88c5ef-kube-api-access-lx6cs\") pod \"keystone-bootstrap-qf8ng\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") " pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:43 crc kubenswrapper[4742]: I0317 11:32:43.056730 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qf8ng" Mar 17 11:32:43 crc kubenswrapper[4742]: I0317 11:32:43.920765 4742 generic.go:334] "Generic (PLEG): container finished" podID="603a0e75-694a-4ab5-bbe0-616f617bc949" containerID="8a4d5a7aec20b9bffbb6a7e48f746ea28834748dbe582be542f07299629eb0dc" exitCode=0 Mar 17 11:32:43 crc kubenswrapper[4742]: I0317 11:32:43.920850 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7kmxq" event={"ID":"603a0e75-694a-4ab5-bbe0-616f617bc949","Type":"ContainerDied","Data":"8a4d5a7aec20b9bffbb6a7e48f746ea28834748dbe582be542f07299629eb0dc"} Mar 17 11:32:46 crc kubenswrapper[4742]: I0317 11:32:46.797041 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" podUID="1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.133:5353: i/o timeout" Mar 17 11:32:48 crc kubenswrapper[4742]: I0317 11:32:48.910802 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:48 crc kubenswrapper[4742]: I0317 11:32:48.917861 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:48 crc kubenswrapper[4742]: I0317 11:32:48.992187 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6558577bcc-xrft9" event={"ID":"f848d4a4-4dba-4636-942d-340d83b7750b","Type":"ContainerDied","Data":"17e08657e129d3e0cee1cedb9965e1416c7f09ca8da92ba11f3b7479a45bea30"} Mar 17 11:32:48 crc kubenswrapper[4742]: I0317 11:32:48.992229 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6558577bcc-xrft9" Mar 17 11:32:48 crc kubenswrapper[4742]: I0317 11:32:48.993819 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68d65d5b97-9g9h5" event={"ID":"d2e1f113-344f-4703-9ba2-d4aabebeb1d7","Type":"ContainerDied","Data":"f2df3fe58722e84de899f4186e32edd3451a9ed83c29faa704c45165af53f15d"} Mar 17 11:32:48 crc kubenswrapper[4742]: I0317 11:32:48.995042 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68d65d5b97-9g9h5" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.072679 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f848d4a4-4dba-4636-942d-340d83b7750b-scripts\") pod \"f848d4a4-4dba-4636-942d-340d83b7750b\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.072759 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f848d4a4-4dba-4636-942d-340d83b7750b-logs\") pod \"f848d4a4-4dba-4636-942d-340d83b7750b\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.072955 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jqsd\" (UniqueName: \"kubernetes.io/projected/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-kube-api-access-2jqsd\") pod \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.072994 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f848d4a4-4dba-4636-942d-340d83b7750b-horizon-secret-key\") pod \"f848d4a4-4dba-4636-942d-340d83b7750b\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.073029 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-logs\") pod \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.073132 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f848d4a4-4dba-4636-942d-340d83b7750b-logs" (OuterVolumeSpecName: "logs") pod "f848d4a4-4dba-4636-942d-340d83b7750b" (UID: "f848d4a4-4dba-4636-942d-340d83b7750b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.073145 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-config-data\") pod \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.073179 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-scripts\") pod \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.073260 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-horizon-secret-key\") pod \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\" (UID: \"d2e1f113-344f-4703-9ba2-d4aabebeb1d7\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.073313 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nf5v\" (UniqueName: \"kubernetes.io/projected/f848d4a4-4dba-4636-942d-340d83b7750b-kube-api-access-6nf5v\") pod \"f848d4a4-4dba-4636-942d-340d83b7750b\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.073379 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f848d4a4-4dba-4636-942d-340d83b7750b-config-data\") pod \"f848d4a4-4dba-4636-942d-340d83b7750b\" (UID: \"f848d4a4-4dba-4636-942d-340d83b7750b\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.073510 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f848d4a4-4dba-4636-942d-340d83b7750b-scripts" (OuterVolumeSpecName: "scripts") pod "f848d4a4-4dba-4636-942d-340d83b7750b" (UID: "f848d4a4-4dba-4636-942d-340d83b7750b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.073965 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f848d4a4-4dba-4636-942d-340d83b7750b-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.074000 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f848d4a4-4dba-4636-942d-340d83b7750b-logs\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.074062 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-logs" (OuterVolumeSpecName: "logs") pod "d2e1f113-344f-4703-9ba2-d4aabebeb1d7" (UID: "d2e1f113-344f-4703-9ba2-d4aabebeb1d7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.074141 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-scripts" (OuterVolumeSpecName: "scripts") pod "d2e1f113-344f-4703-9ba2-d4aabebeb1d7" (UID: "d2e1f113-344f-4703-9ba2-d4aabebeb1d7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.074672 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f848d4a4-4dba-4636-942d-340d83b7750b-config-data" (OuterVolumeSpecName: "config-data") pod "f848d4a4-4dba-4636-942d-340d83b7750b" (UID: "f848d4a4-4dba-4636-942d-340d83b7750b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.074966 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-config-data" (OuterVolumeSpecName: "config-data") pod "d2e1f113-344f-4703-9ba2-d4aabebeb1d7" (UID: "d2e1f113-344f-4703-9ba2-d4aabebeb1d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.077345 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f848d4a4-4dba-4636-942d-340d83b7750b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f848d4a4-4dba-4636-942d-340d83b7750b" (UID: "f848d4a4-4dba-4636-942d-340d83b7750b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.077863 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d2e1f113-344f-4703-9ba2-d4aabebeb1d7" (UID: "d2e1f113-344f-4703-9ba2-d4aabebeb1d7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.082149 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-kube-api-access-2jqsd" (OuterVolumeSpecName: "kube-api-access-2jqsd") pod "d2e1f113-344f-4703-9ba2-d4aabebeb1d7" (UID: "d2e1f113-344f-4703-9ba2-d4aabebeb1d7"). InnerVolumeSpecName "kube-api-access-2jqsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.092345 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f848d4a4-4dba-4636-942d-340d83b7750b-kube-api-access-6nf5v" (OuterVolumeSpecName: "kube-api-access-6nf5v") pod "f848d4a4-4dba-4636-942d-340d83b7750b" (UID: "f848d4a4-4dba-4636-942d-340d83b7750b"). InnerVolumeSpecName "kube-api-access-6nf5v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.176333 4742 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.176390 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nf5v\" (UniqueName: \"kubernetes.io/projected/f848d4a4-4dba-4636-942d-340d83b7750b-kube-api-access-6nf5v\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.176406 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f848d4a4-4dba-4636-942d-340d83b7750b-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.176417 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jqsd\" (UniqueName: \"kubernetes.io/projected/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-kube-api-access-2jqsd\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.176425 4742 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f848d4a4-4dba-4636-942d-340d83b7750b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.176434 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-logs\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.176466 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.176475 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2e1f113-344f-4703-9ba2-d4aabebeb1d7-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.356215 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6558577bcc-xrft9"] Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.372411 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6558577bcc-xrft9"] Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.393445 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-68d65d5b97-9g9h5"] Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.401703 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-68d65d5b97-9g9h5"] Mar 17 11:32:49 crc kubenswrapper[4742]: E0317 11:32:49.507087 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 17 11:32:49 crc kubenswrapper[4742]: E0317 11:32:49.507297 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-96fnv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-cs4pt_openstack(90b52e42-6eca-4585-95a0-057055089c97): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 11:32:49 crc kubenswrapper[4742]: E0317 11:32:49.508659 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-cs4pt" podUID="90b52e42-6eca-4585-95a0-057055089c97" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.588582 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.595682 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7kmxq" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.686484 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603a0e75-694a-4ab5-bbe0-616f617bc949-combined-ca-bundle\") pod \"603a0e75-694a-4ab5-bbe0-616f617bc949\" (UID: \"603a0e75-694a-4ab5-bbe0-616f617bc949\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.686585 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-dns-swift-storage-0\") pod \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.686627 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7pfm\" (UniqueName: \"kubernetes.io/projected/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-kube-api-access-f7pfm\") pod \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.686650 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/603a0e75-694a-4ab5-bbe0-616f617bc949-config\") pod \"603a0e75-694a-4ab5-bbe0-616f617bc949\" (UID: \"603a0e75-694a-4ab5-bbe0-616f617bc949\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.686725 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-dns-svc\") pod \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.686751 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4lzd\" (UniqueName: \"kubernetes.io/projected/603a0e75-694a-4ab5-bbe0-616f617bc949-kube-api-access-b4lzd\") pod \"603a0e75-694a-4ab5-bbe0-616f617bc949\" (UID: \"603a0e75-694a-4ab5-bbe0-616f617bc949\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.686771 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-ovsdbserver-sb\") pod \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.686786 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-config\") pod \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.686863 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-ovsdbserver-nb\") pod \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\" (UID: \"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c\") " Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.690842 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/603a0e75-694a-4ab5-bbe0-616f617bc949-kube-api-access-b4lzd" (OuterVolumeSpecName: "kube-api-access-b4lzd") pod 
"603a0e75-694a-4ab5-bbe0-616f617bc949" (UID: "603a0e75-694a-4ab5-bbe0-616f617bc949"). InnerVolumeSpecName "kube-api-access-b4lzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.691448 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-kube-api-access-f7pfm" (OuterVolumeSpecName: "kube-api-access-f7pfm") pod "1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" (UID: "1f5b97b6-4ed3-4a21-acc9-92a990fbe52c"). InnerVolumeSpecName "kube-api-access-f7pfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.707276 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603a0e75-694a-4ab5-bbe0-616f617bc949-config" (OuterVolumeSpecName: "config") pod "603a0e75-694a-4ab5-bbe0-616f617bc949" (UID: "603a0e75-694a-4ab5-bbe0-616f617bc949"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.712854 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603a0e75-694a-4ab5-bbe0-616f617bc949-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "603a0e75-694a-4ab5-bbe0-616f617bc949" (UID: "603a0e75-694a-4ab5-bbe0-616f617bc949"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.731620 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" (UID: "1f5b97b6-4ed3-4a21-acc9-92a990fbe52c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.737376 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" (UID: "1f5b97b6-4ed3-4a21-acc9-92a990fbe52c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.738470 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" (UID: "1f5b97b6-4ed3-4a21-acc9-92a990fbe52c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.741354 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-config" (OuterVolumeSpecName: "config") pod "1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" (UID: "1f5b97b6-4ed3-4a21-acc9-92a990fbe52c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.742808 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" (UID: "1f5b97b6-4ed3-4a21-acc9-92a990fbe52c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.788554 4742 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.788588 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7pfm\" (UniqueName: \"kubernetes.io/projected/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-kube-api-access-f7pfm\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.788602 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/603a0e75-694a-4ab5-bbe0-616f617bc949-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.788615 4742 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.788626 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4lzd\" (UniqueName: \"kubernetes.io/projected/603a0e75-694a-4ab5-bbe0-616f617bc949-kube-api-access-b4lzd\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.788636 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.788646 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.788656 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:49 crc kubenswrapper[4742]: I0317 11:32:49.788668 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/603a0e75-694a-4ab5-bbe0-616f617bc949-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.007325 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.007363 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" event={"ID":"1f5b97b6-4ed3-4a21-acc9-92a990fbe52c","Type":"ContainerDied","Data":"09d58755e6c9b6d33b0b6d038e61f810e1ca1daee640d3c8eb71418907824ad8"} Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.008973 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7kmxq" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.009542 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7kmxq" event={"ID":"603a0e75-694a-4ab5-bbe0-616f617bc949","Type":"ContainerDied","Data":"eb983291c14ffe6e637c5e5a901977c8be9d712e13ab649764eac90d5bc1499b"} Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.009578 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb983291c14ffe6e637c5e5a901977c8be9d712e13ab649764eac90d5bc1499b" Mar 17 11:32:50 crc kubenswrapper[4742]: E0317 11:32:50.011185 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-cs4pt" podUID="90b52e42-6eca-4585-95a0-057055089c97" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.068210 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-w62xn"] Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.077371 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-w62xn"] Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.678547 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" path="/var/lib/kubelet/pods/1f5b97b6-4ed3-4a21-acc9-92a990fbe52c/volumes" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.681548 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2e1f113-344f-4703-9ba2-d4aabebeb1d7" path="/var/lib/kubelet/pods/d2e1f113-344f-4703-9ba2-d4aabebeb1d7/volumes" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.682472 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f848d4a4-4dba-4636-942d-340d83b7750b" path="/var/lib/kubelet/pods/f848d4a4-4dba-4636-942d-340d83b7750b/volumes" Mar 17 11:32:50 crc kubenswrapper[4742]: E0317 11:32:50.715956 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 17 11:32:50 crc kubenswrapper[4742]: E0317 11:32:50.716132 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fxtkz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-qzc74_openstack(5c75af6d-6842-49b5-aebe-54feb0644942): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 11:32:50 crc kubenswrapper[4742]: E0317 11:32:50.717276 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-qzc74" podUID="5c75af6d-6842-49b5-aebe-54feb0644942" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.735880 4742 scope.go:117] "RemoveContainer" containerID="2d21744bd7eebff5a8be8f9f5e1aa21e2d81803774869f771ed707a1839c7905" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.930208 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-pxjc2"] Mar 17 11:32:50 crc kubenswrapper[4742]: E0317 11:32:50.930842 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="603a0e75-694a-4ab5-bbe0-616f617bc949" containerName="neutron-db-sync" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.930860 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="603a0e75-694a-4ab5-bbe0-616f617bc949" containerName="neutron-db-sync" Mar 17 11:32:50 crc kubenswrapper[4742]: E0317 11:32:50.930873 4742 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" containerName="init" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.930879 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" containerName="init" Mar 17 11:32:50 crc kubenswrapper[4742]: E0317 11:32:50.930887 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" containerName="dnsmasq-dns" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.930893 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" containerName="dnsmasq-dns" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.931160 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" containerName="dnsmasq-dns" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.931173 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="603a0e75-694a-4ab5-bbe0-616f617bc949" containerName="neutron-db-sync" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.932082 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.958540 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-pxjc2"] Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.963075 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9d44b9d7b-r5znz"] Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.964567 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.971710 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9d44b9d7b-r5znz"] Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.976586 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-ptsjt" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.976769 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.976930 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 17 11:32:50 crc kubenswrapper[4742]: I0317 11:32:50.976989 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 17 11:32:51 crc kubenswrapper[4742]: E0317 11:32:51.052936 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-qzc74" podUID="5c75af6d-6842-49b5-aebe-54feb0644942" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.111846 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.111987 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-dns-svc\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.112028 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.112052 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6f8m\" (UniqueName: \"kubernetes.io/projected/6275127e-0ae2-4d23-8592-ba85c3a7661b-kube-api-access-s6f8m\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.112105 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-ovndb-tls-certs\") pod \"neutron-9d44b9d7b-r5znz\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.112121 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-config\") pod \"neutron-9d44b9d7b-r5znz\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.112140 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-combined-ca-bundle\") pod \"neutron-9d44b9d7b-r5znz\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.112171 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt2z9\" (UniqueName: \"kubernetes.io/projected/69896e76-e60c-4941-b013-a702791923ec-kube-api-access-vt2z9\") pod \"neutron-9d44b9d7b-r5znz\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.112197 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.112237 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-config\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.112261 4742 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-httpd-config\") pod \"neutron-9d44b9d7b-r5znz\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.127220 4742 scope.go:117] "RemoveContainer" containerID="374d05242d8d6a4be6204eb855b905eaa2bbf7f53aacf4468460a8fdd83164bc" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.201392 4742 scope.go:117] "RemoveContainer" containerID="972b9842c2771d49514f2bb33841fdda64b2d5f3606bf6cf485d924353397228" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.213449 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt2z9\" (UniqueName: \"kubernetes.io/projected/69896e76-e60c-4941-b013-a702791923ec-kube-api-access-vt2z9\") pod \"neutron-9d44b9d7b-r5znz\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.213506 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.213566 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-config\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.213589 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-httpd-config\") pod \"neutron-9d44b9d7b-r5znz\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.213615 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.213667 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-dns-svc\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.213709 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.213736 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6f8m\" (UniqueName: 
\"kubernetes.io/projected/6275127e-0ae2-4d23-8592-ba85c3a7661b-kube-api-access-s6f8m\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.213800 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-ovndb-tls-certs\") pod \"neutron-9d44b9d7b-r5znz\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.213823 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-config\") pod \"neutron-9d44b9d7b-r5znz\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.213849 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-combined-ca-bundle\") pod \"neutron-9d44b9d7b-r5znz\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.215636 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.216761 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.218308 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-dns-svc\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.219197 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-config\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.219650 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.223781 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-config\") pod \"neutron-9d44b9d7b-r5znz\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " 
pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.226953 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-httpd-config\") pod \"neutron-9d44b9d7b-r5znz\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.231785 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-combined-ca-bundle\") pod \"neutron-9d44b9d7b-r5znz\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.241504 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-ovndb-tls-certs\") pod \"neutron-9d44b9d7b-r5znz\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.241721 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt2z9\" (UniqueName: \"kubernetes.io/projected/69896e76-e60c-4941-b013-a702791923ec-kube-api-access-vt2z9\") pod \"neutron-9d44b9d7b-r5znz\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.246929 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6f8m\" (UniqueName: \"kubernetes.io/projected/6275127e-0ae2-4d23-8592-ba85c3a7661b-kube-api-access-s6f8m\") pod \"dnsmasq-dns-55f844cf75-pxjc2\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.375432 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cbc75d594-mvhf5"] Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.376179 4742 scope.go:117] "RemoveContainer" containerID="821f7b0c9ffb1c5b507d88792fc4fdc096a2cfb7a0a31e127fe30419b71a81af" Mar 17 11:32:51 crc kubenswrapper[4742]: W0317 11:32:51.382987 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2bbef92_cd02_42d8_b81d_ab7248e29328.slice/crio-f7ff99f1880036e2d5976ac2a62e3410d8bb4ed86b7d89695323874b86be177c WatchSource:0}: Error finding container f7ff99f1880036e2d5976ac2a62e3410d8bb4ed86b7d89695323874b86be177c: Status 404 returned error can't find the container with id f7ff99f1880036e2d5976ac2a62e3410d8bb4ed86b7d89695323874b86be177c Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.406585 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.411331 4742 scope.go:117] "RemoveContainer" containerID="f62f2d9b2ea480fb5376fcc72cb6afe1ec837e34900114bff50803a1ce8cbc20" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.491600 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.674151 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wjx6c"] Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.685868 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.739123 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 11:32:51 crc kubenswrapper[4742]: W0317 11:32:51.740386 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d937ab3_6dfb_4b0c_a846_da4820bad05f.slice/crio-1559594b60e090c6fda96076e1e12cbefd8b566ffab8c5b3b508707d9ffd7d64 WatchSource:0}: Error finding container 1559594b60e090c6fda96076e1e12cbefd8b566ffab8c5b3b508707d9ffd7d64: Status 404 returned error can't find the container with id 1559594b60e090c6fda96076e1e12cbefd8b566ffab8c5b3b508707d9ffd7d64 Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.798649 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-w62xn" podUID="1f5b97b6-4ed3-4a21-acc9-92a990fbe52c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.133:5353: i/o timeout" Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.819454 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qf8ng"] Mar 17 11:32:51 crc kubenswrapper[4742]: I0317 11:32:51.831549 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c4556b444-kq454"] Mar 17 11:32:51 crc kubenswrapper[4742]: W0317 11:32:51.844522 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda81353e8_6a78_46f3_ae59_028afb88c5ef.slice/crio-fd7795d4c5e1142dd1351ed22a55f6680cd55a66df3d15bb89b6b1d6405e0ec4 WatchSource:0}: Error finding container fd7795d4c5e1142dd1351ed22a55f6680cd55a66df3d15bb89b6b1d6405e0ec4: Status 404 returned error can't find the container with id fd7795d4c5e1142dd1351ed22a55f6680cd55a66df3d15bb89b6b1d6405e0ec4 Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.055024 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wjx6c" event={"ID":"23024865-2dad-4a51-af8b-7d7a224c8ce8","Type":"ContainerStarted","Data":"7c6c0d952f237a67f8d73cd492a5849ea114b6e55c0029547f2fcaedaa3a4ab4"} Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.055365 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wjx6c" event={"ID":"23024865-2dad-4a51-af8b-7d7a224c8ce8","Type":"ContainerStarted","Data":"e66dc6a1397a0a2ece9e4a02926c2aa27e1de0a639c662aaba408b0263131059"} Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.067573 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-647cff84fc-lltcg" event={"ID":"4eb7c4c3-2dad-464d-8e2c-09e618d140e4","Type":"ContainerStarted","Data":"08a901b0036e08829aab4952f6da9de41d1d594fd0633eb6a7725734021ddeb4"} Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.067599 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-647cff84fc-lltcg" 
event={"ID":"4eb7c4c3-2dad-464d-8e2c-09e618d140e4","Type":"ContainerStarted","Data":"52611f020d25ddf92efdb5a2f6d79051f256d01eebacede571cf7a38a89a529b"} Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.067686 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-647cff84fc-lltcg" podUID="4eb7c4c3-2dad-464d-8e2c-09e618d140e4" containerName="horizon-log" containerID="cri-o://52611f020d25ddf92efdb5a2f6d79051f256d01eebacede571cf7a38a89a529b" gracePeriod=30 Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.067754 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-647cff84fc-lltcg" podUID="4eb7c4c3-2dad-464d-8e2c-09e618d140e4" containerName="horizon" containerID="cri-o://08a901b0036e08829aab4952f6da9de41d1d594fd0633eb6a7725734021ddeb4" gracePeriod=30 Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.078256 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-wjx6c" podStartSLOduration=17.07824183 podStartE2EDuration="17.07824183s" podCreationTimestamp="2026-03-17 11:32:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:32:52.075520185 +0000 UTC m=+1275.201647943" watchObservedRunningTime="2026-03-17 11:32:52.07824183 +0000 UTC m=+1275.204369588" Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.113877 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cbc75d594-mvhf5" event={"ID":"f2bbef92-cd02-42d8-b81d-ab7248e29328","Type":"ContainerStarted","Data":"ae60630080e9cc570d98471ac0c31f7d1dbbc55f4cd0bf7020ef89cc50cc5e24"} Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.113945 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cbc75d594-mvhf5" event={"ID":"f2bbef92-cd02-42d8-b81d-ab7248e29328","Type":"ContainerStarted","Data":"f7ff99f1880036e2d5976ac2a62e3410d8bb4ed86b7d89695323874b86be177c"} Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.123075 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9d44b9d7b-r5znz"] Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.131091 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-647cff84fc-lltcg" podStartSLOduration=2.7379657379999998 podStartE2EDuration="27.131074661s" podCreationTimestamp="2026-03-17 11:32:25 +0000 UTC" firstStartedPulling="2026-03-17 11:32:26.255081634 +0000 UTC m=+1249.381209392" lastFinishedPulling="2026-03-17 11:32:50.648190547 +0000 UTC m=+1273.774318315" observedRunningTime="2026-03-17 11:32:52.103691499 +0000 UTC m=+1275.229819257" watchObservedRunningTime="2026-03-17 11:32:52.131074661 +0000 UTC m=+1275.257202419" Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.132373 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecfcf738-372c-42d4-a4b0-c1f88be1dd43","Type":"ContainerStarted","Data":"623151fda616fa23271be1c022ca456476faf88a7e67839a6966fde857b2fa0d"} Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.143064 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5d937ab3-6dfb-4b0c-a846-da4820bad05f","Type":"ContainerStarted","Data":"1559594b60e090c6fda96076e1e12cbefd8b566ffab8c5b3b508707d9ffd7d64"} Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.144211 4742 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/horizon-5c4556b444-kq454" event={"ID":"480fea20-eab5-4c68-9bc3-9b218ba0b43d","Type":"ContainerStarted","Data":"de9e3dc64059df24aabbdc56daa6858161bb164d4d0bfa763050c221e312eb64"}
Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.145242 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7mmzn" event={"ID":"e3261b59-fc08-4596-bda8-7b398ef979e4","Type":"ContainerStarted","Data":"cb6c5cccbe986bd1e0e45082009a082d04c991d8ad6d2ed9bb320e5975d56e31"}
Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.154053 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qf8ng" event={"ID":"a81353e8-6a78-46f3-ae59-028afb88c5ef","Type":"ContainerStarted","Data":"fd7795d4c5e1142dd1351ed22a55f6680cd55a66df3d15bb89b6b1d6405e0ec4"}
Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.201442 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-pxjc2"]
Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.226570 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-7mmzn" podStartSLOduration=5.954272945 podStartE2EDuration="30.226552522s" podCreationTimestamp="2026-03-17 11:32:22 +0000 UTC" firstStartedPulling="2026-03-17 11:32:24.540794415 +0000 UTC m=+1247.666922173" lastFinishedPulling="2026-03-17 11:32:48.813073982 +0000 UTC m=+1271.939201750" observedRunningTime="2026-03-17 11:32:52.192274246 +0000 UTC m=+1275.318402004" watchObservedRunningTime="2026-03-17 11:32:52.226552522 +0000 UTC m=+1275.352680280"
Mar 17 11:32:52 crc kubenswrapper[4742]: I0317 11:32:52.592460 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.189709 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5d937ab3-6dfb-4b0c-a846-da4820bad05f","Type":"ContainerStarted","Data":"b16a1e0dea5c35e5a713c52318aa554fbd5a170e6b1beda799e52b7f33cd1c7b"}
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.213014 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wjx6c" event={"ID":"23024865-2dad-4a51-af8b-7d7a224c8ce8","Type":"ContainerDied","Data":"7c6c0d952f237a67f8d73cd492a5849ea114b6e55c0029547f2fcaedaa3a4ab4"}
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.213041 4742 generic.go:334] "Generic (PLEG): container finished" podID="23024865-2dad-4a51-af8b-7d7a224c8ce8" containerID="7c6c0d952f237a67f8d73cd492a5849ea114b6e55c0029547f2fcaedaa3a4ab4" exitCode=0
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.220309 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9d44b9d7b-r5znz" event={"ID":"69896e76-e60c-4941-b013-a702791923ec","Type":"ContainerStarted","Data":"831de658b40e525dec4242bdf49d13b3b0065da3ec5d4638a8e9dc7531098e33"}
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.220348 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9d44b9d7b-r5znz" event={"ID":"69896e76-e60c-4941-b013-a702791923ec","Type":"ContainerStarted","Data":"4fc0b50a962ed4dea4c4d507eee8424247b69b59a9f391dc5f262e36cfcffcf0"}
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.224893 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-774cd45c89-tc5lr"]
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.240194 4742 generic.go:334] "Generic (PLEG): container finished" podID="6275127e-0ae2-4d23-8592-ba85c3a7661b" containerID="d6e60441db6bd455f12c0cd16ef3b9173ae95b2c9f39155088fdee419abb3bfe" exitCode=0
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.241605 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c4556b444-kq454" event={"ID":"480fea20-eab5-4c68-9bc3-9b218ba0b43d","Type":"ContainerStarted","Data":"4b5d5b68d828968e315e14066de2d3e6c1c0ab2dfea022f4f726cb3fabd01f1e"}
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.241648 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c4556b444-kq454" event={"ID":"480fea20-eab5-4c68-9bc3-9b218ba0b43d","Type":"ContainerStarted","Data":"6bf61a1e6614b6c68647c1658d4a9c6132c7cee51994cf3e7e6180615074d6dd"}
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.241660 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-774cd45c89-tc5lr"]
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.241673 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" event={"ID":"6275127e-0ae2-4d23-8592-ba85c3a7661b","Type":"ContainerDied","Data":"d6e60441db6bd455f12c0cd16ef3b9173ae95b2c9f39155088fdee419abb3bfe"}
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.241685 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" event={"ID":"6275127e-0ae2-4d23-8592-ba85c3a7661b","Type":"ContainerStarted","Data":"14973ac7c518e0e6a78e0274abc70f735cdf1541353e7b74f225d9b2a8d35f16"}
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.241771 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.246121 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.246376 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.246584 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qf8ng" event={"ID":"a81353e8-6a78-46f3-ae59-028afb88c5ef","Type":"ContainerStarted","Data":"0aa271424e124473fa5166acc53c483de31ee97380a2e1b5a179eee85102ec11"}
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.277435 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-combined-ca-bundle\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.277536 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6njh\" (UniqueName: \"kubernetes.io/projected/ca9f66f5-5921-4f35-a45a-0de69f1a3434-kube-api-access-m6njh\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.277565 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-internal-tls-certs\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.277597 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-config\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.277669 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-httpd-config\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.277695 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-public-tls-certs\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.277717 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-ovndb-tls-certs\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.283669 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5c4556b444-kq454" podStartSLOduration=22.283651427 podStartE2EDuration="22.283651427s" podCreationTimestamp="2026-03-17 11:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:32:53.262323182 +0000 UTC m=+1276.388450940" watchObservedRunningTime="2026-03-17 11:32:53.283651427 +0000 UTC m=+1276.409779175"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.294516 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cbc75d594-mvhf5" event={"ID":"f2bbef92-cd02-42d8-b81d-ab7248e29328","Type":"ContainerStarted","Data":"bdba66d58b80df369c99a5b4d40c5a7e87b7ad31620cffac6b75800835ccef63"}
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.302587 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24b880a5-c4dc-4566-80c3-13fddf078932","Type":"ContainerStarted","Data":"e3e380aa34e43555ff3995abee65e5a83cc2a4dd6a3333c2f7d9c6f078af009d"}
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.307634 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qf8ng" podStartSLOduration=11.307623224 podStartE2EDuration="11.307623224s" podCreationTimestamp="2026-03-17 11:32:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:32:53.299039305 +0000 UTC m=+1276.425167063" watchObservedRunningTime="2026-03-17 11:32:53.307623224 +0000 UTC m=+1276.433750982"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.354625 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5cbc75d594-mvhf5" podStartSLOduration=22.354603622 podStartE2EDuration="22.354603622s" podCreationTimestamp="2026-03-17 11:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:32:53.347204296 +0000 UTC m=+1276.473332054" watchObservedRunningTime="2026-03-17 11:32:53.354603622 +0000 UTC m=+1276.480731400"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.379814 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6njh\" (UniqueName: \"kubernetes.io/projected/ca9f66f5-5921-4f35-a45a-0de69f1a3434-kube-api-access-m6njh\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.379879 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-internal-tls-certs\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.379936 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-config\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.380005 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-httpd-config\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.380020 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-public-tls-certs\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.380047 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-ovndb-tls-certs\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.380132 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-combined-ca-bundle\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.387362 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-httpd-config\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.390617 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-public-tls-certs\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.391510 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-internal-tls-certs\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.394024 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-combined-ca-bundle\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.395054 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-config\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.395366 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6njh\" (UniqueName: \"kubernetes.io/projected/ca9f66f5-5921-4f35-a45a-0de69f1a3434-kube-api-access-m6njh\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.412826 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-ovndb-tls-certs\") pod \"neutron-774cd45c89-tc5lr\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:53 crc kubenswrapper[4742]: I0317 11:32:53.576578 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:54 crc kubenswrapper[4742]: I0317 11:32:54.373542 4742 generic.go:334] "Generic (PLEG): container finished" podID="e3261b59-fc08-4596-bda8-7b398ef979e4" containerID="cb6c5cccbe986bd1e0e45082009a082d04c991d8ad6d2ed9bb320e5975d56e31" exitCode=0
Mar 17 11:32:54 crc kubenswrapper[4742]: I0317 11:32:54.373668 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7mmzn" event={"ID":"e3261b59-fc08-4596-bda8-7b398ef979e4","Type":"ContainerDied","Data":"cb6c5cccbe986bd1e0e45082009a082d04c991d8ad6d2ed9bb320e5975d56e31"}
Mar 17 11:32:54 crc kubenswrapper[4742]: I0317 11:32:54.392646 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24b880a5-c4dc-4566-80c3-13fddf078932","Type":"ContainerStarted","Data":"21dfba13d6ea06b5e5b7ca6e464fcedbd33bca247f133278945a35a27374bda5"}
Mar 17 11:32:54 crc kubenswrapper[4742]: I0317 11:32:54.406147 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5d937ab3-6dfb-4b0c-a846-da4820bad05f","Type":"ContainerStarted","Data":"471b1b0b9bc05fd61003bdefb4957ca8755e2dd3462d5fa147269aa7e6143ecf"}
Mar 17 11:32:54 crc kubenswrapper[4742]: I0317 11:32:54.415895 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9d44b9d7b-r5znz" event={"ID":"69896e76-e60c-4941-b013-a702791923ec","Type":"ContainerStarted","Data":"20d2c94ecc394709e98a453bbe007a230f2c3d20068fb621f4384bce882f1336"}
Mar 17 11:32:54 crc kubenswrapper[4742]: I0317 11:32:54.431556 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=13.431532279 podStartE2EDuration="13.431532279s" podCreationTimestamp="2026-03-17 11:32:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:32:54.427985011 +0000 UTC m=+1277.554112769" watchObservedRunningTime="2026-03-17 11:32:54.431532279 +0000 UTC m=+1277.557660037"
Mar 17 11:32:54 crc kubenswrapper[4742]: I0317 11:32:54.475130 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-9d44b9d7b-r5znz" podStartSLOduration=4.475112194 podStartE2EDuration="4.475112194s" podCreationTimestamp="2026-03-17 11:32:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:32:54.467351687 +0000 UTC m=+1277.593479455" watchObservedRunningTime="2026-03-17 11:32:54.475112194 +0000 UTC m=+1277.601239952"
Mar 17 11:32:54 crc kubenswrapper[4742]: I0317 11:32:54.785423 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-774cd45c89-tc5lr"]
Mar 17 11:32:54 crc kubenswrapper[4742]: I0317 11:32:54.814631 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wjx6c"
Mar 17 11:32:54 crc kubenswrapper[4742]: I0317 11:32:54.920319 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23024865-2dad-4a51-af8b-7d7a224c8ce8-operator-scripts\") pod \"23024865-2dad-4a51-af8b-7d7a224c8ce8\" (UID: \"23024865-2dad-4a51-af8b-7d7a224c8ce8\") "
Mar 17 11:32:54 crc kubenswrapper[4742]: I0317 11:32:54.920453 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxjsb\" (UniqueName: \"kubernetes.io/projected/23024865-2dad-4a51-af8b-7d7a224c8ce8-kube-api-access-rxjsb\") pod \"23024865-2dad-4a51-af8b-7d7a224c8ce8\" (UID: \"23024865-2dad-4a51-af8b-7d7a224c8ce8\") "
Mar 17 11:32:54 crc kubenswrapper[4742]: I0317 11:32:54.921028 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23024865-2dad-4a51-af8b-7d7a224c8ce8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23024865-2dad-4a51-af8b-7d7a224c8ce8" (UID: "23024865-2dad-4a51-af8b-7d7a224c8ce8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:32:54 crc kubenswrapper[4742]: I0317 11:32:54.922423 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23024865-2dad-4a51-af8b-7d7a224c8ce8-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:54 crc kubenswrapper[4742]: I0317 11:32:54.927764 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23024865-2dad-4a51-af8b-7d7a224c8ce8-kube-api-access-rxjsb" (OuterVolumeSpecName: "kube-api-access-rxjsb") pod "23024865-2dad-4a51-af8b-7d7a224c8ce8" (UID: "23024865-2dad-4a51-af8b-7d7a224c8ce8"). InnerVolumeSpecName "kube-api-access-rxjsb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.024444 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxjsb\" (UniqueName: \"kubernetes.io/projected/23024865-2dad-4a51-af8b-7d7a224c8ce8-kube-api-access-rxjsb\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.424832 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24b880a5-c4dc-4566-80c3-13fddf078932","Type":"ContainerStarted","Data":"8f9fd3cf05b1b274b0195ccc59bfc69550a213322d5dfbdf127ca7bc53d87c06"}
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.427419 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wjx6c" event={"ID":"23024865-2dad-4a51-af8b-7d7a224c8ce8","Type":"ContainerDied","Data":"e66dc6a1397a0a2ece9e4a02926c2aa27e1de0a639c662aaba408b0263131059"}
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.427522 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e66dc6a1397a0a2ece9e4a02926c2aa27e1de0a639c662aaba408b0263131059"
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.427625 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wjx6c"
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.429722 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecfcf738-372c-42d4-a4b0-c1f88be1dd43","Type":"ContainerStarted","Data":"28894944f3e0b9ab2c2069b968405fd3eb1d38909532b97ffb561b4777f86dbf"}
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.431794 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-774cd45c89-tc5lr" event={"ID":"ca9f66f5-5921-4f35-a45a-0de69f1a3434","Type":"ContainerStarted","Data":"50df2e1afbc3ba5e5be7c6cf914d4d34d955758f6abc13b7f49af5e247bff50c"}
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.431886 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-774cd45c89-tc5lr"
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.431898 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-774cd45c89-tc5lr" event={"ID":"ca9f66f5-5921-4f35-a45a-0de69f1a3434","Type":"ContainerStarted","Data":"9cbb8d608c9da7785459131e7c950b27ff8bb5a50ff50434ae31b524c78ba95d"}
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.431922 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-774cd45c89-tc5lr" event={"ID":"ca9f66f5-5921-4f35-a45a-0de69f1a3434","Type":"ContainerStarted","Data":"169676225e070c8b5d3d0453469f9c25590bc2576aafb8b58c5b3a187de378e8"}
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.433838 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" event={"ID":"6275127e-0ae2-4d23-8592-ba85c3a7661b","Type":"ContainerStarted","Data":"4f71c848c99a2374408a16b27e0a2bc9433f250c6dd3ea01e5171e0d2d8cb765"}
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.434369 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-pxjc2"
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.434591 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-9d44b9d7b-r5znz"
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.449691 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=14.449677299 podStartE2EDuration="14.449677299s" podCreationTimestamp="2026-03-17 11:32:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:32:55.446594933 +0000 UTC m=+1278.572722701" watchObservedRunningTime="2026-03-17 11:32:55.449677299 +0000 UTC m=+1278.575805057"
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.481591 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" podStartSLOduration=5.481574477 podStartE2EDuration="5.481574477s" podCreationTimestamp="2026-03-17 11:32:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:32:55.476096034 +0000 UTC m=+1278.602223792" watchObservedRunningTime="2026-03-17 11:32:55.481574477 +0000 UTC m=+1278.607702235"
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.503645 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-774cd45c89-tc5lr" podStartSLOduration=2.503627572 podStartE2EDuration="2.503627572s" podCreationTimestamp="2026-03-17 11:32:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:32:55.500120394 +0000 UTC m=+1278.626248152" watchObservedRunningTime="2026-03-17 11:32:55.503627572 +0000 UTC m=+1278.629755330"
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.587075 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-647cff84fc-lltcg"
Mar 17 11:32:55 crc kubenswrapper[4742]: I0317 11:32:55.964098 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7mmzn"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.147179 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mmvj\" (UniqueName: \"kubernetes.io/projected/e3261b59-fc08-4596-bda8-7b398ef979e4-kube-api-access-5mmvj\") pod \"e3261b59-fc08-4596-bda8-7b398ef979e4\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") "
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.147626 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3261b59-fc08-4596-bda8-7b398ef979e4-logs\") pod \"e3261b59-fc08-4596-bda8-7b398ef979e4\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") "
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.147666 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-combined-ca-bundle\") pod \"e3261b59-fc08-4596-bda8-7b398ef979e4\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") "
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.147688 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-scripts\") pod \"e3261b59-fc08-4596-bda8-7b398ef979e4\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") "
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.147712 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-config-data\") pod \"e3261b59-fc08-4596-bda8-7b398ef979e4\" (UID: \"e3261b59-fc08-4596-bda8-7b398ef979e4\") "
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.151315 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3261b59-fc08-4596-bda8-7b398ef979e4-logs" (OuterVolumeSpecName: "logs") pod "e3261b59-fc08-4596-bda8-7b398ef979e4" (UID: "e3261b59-fc08-4596-bda8-7b398ef979e4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.155066 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3261b59-fc08-4596-bda8-7b398ef979e4-kube-api-access-5mmvj" (OuterVolumeSpecName: "kube-api-access-5mmvj") pod "e3261b59-fc08-4596-bda8-7b398ef979e4" (UID: "e3261b59-fc08-4596-bda8-7b398ef979e4"). InnerVolumeSpecName "kube-api-access-5mmvj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.190076 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-scripts" (OuterVolumeSpecName: "scripts") pod "e3261b59-fc08-4596-bda8-7b398ef979e4" (UID: "e3261b59-fc08-4596-bda8-7b398ef979e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.206071 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3261b59-fc08-4596-bda8-7b398ef979e4" (UID: "e3261b59-fc08-4596-bda8-7b398ef979e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.213207 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-config-data" (OuterVolumeSpecName: "config-data") pod "e3261b59-fc08-4596-bda8-7b398ef979e4" (UID: "e3261b59-fc08-4596-bda8-7b398ef979e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.249960 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3261b59-fc08-4596-bda8-7b398ef979e4-logs\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.250003 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.250017 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.250029 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3261b59-fc08-4596-bda8-7b398ef979e4-config-data\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.250043 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mmvj\" (UniqueName: \"kubernetes.io/projected/e3261b59-fc08-4596-bda8-7b398ef979e4-kube-api-access-5mmvj\") on node \"crc\" DevicePath \"\""
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.446985 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7mmzn" event={"ID":"e3261b59-fc08-4596-bda8-7b398ef979e4","Type":"ContainerDied","Data":"99fbe9421a6a14651f763389679f8080afc67c31693d12d9a7ff25ca638692f5"}
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.447045 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99fbe9421a6a14651f763389679f8080afc67c31693d12d9a7ff25ca638692f5"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.450211 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7mmzn"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.513832 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5485d7d4fb-62qtm"]
Mar 17 11:32:56 crc kubenswrapper[4742]: E0317 11:32:56.514411 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23024865-2dad-4a51-af8b-7d7a224c8ce8" containerName="mariadb-account-create-update"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.514495 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="23024865-2dad-4a51-af8b-7d7a224c8ce8" containerName="mariadb-account-create-update"
Mar 17 11:32:56 crc kubenswrapper[4742]: E0317 11:32:56.514579 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3261b59-fc08-4596-bda8-7b398ef979e4" containerName="placement-db-sync"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.514632 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3261b59-fc08-4596-bda8-7b398ef979e4" containerName="placement-db-sync"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.514888 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3261b59-fc08-4596-bda8-7b398ef979e4" containerName="placement-db-sync"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.514994 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="23024865-2dad-4a51-af8b-7d7a224c8ce8" containerName="mariadb-account-create-update"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.515929 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.519773 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.522372 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.522586 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.522674 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.522861 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tcgpz"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.554939 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5485d7d4fb-62qtm"]
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.659964 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b078827f-c462-4bbb-8d77-06a978218545-logs\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.660031 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-public-tls-certs\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.660072 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-scripts\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.660126 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-config-data\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.660176 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsrf6\" (UniqueName: \"kubernetes.io/projected/b078827f-c462-4bbb-8d77-06a978218545-kube-api-access-jsrf6\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.660228 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-internal-tls-certs\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.660269 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-combined-ca-bundle\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.763967 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b078827f-c462-4bbb-8d77-06a978218545-logs\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.764007 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-public-tls-certs\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.764041 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-scripts\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.764080 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-config-data\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.764114 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsrf6\" (UniqueName: \"kubernetes.io/projected/b078827f-c462-4bbb-8d77-06a978218545-kube-api-access-jsrf6\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.764159 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-internal-tls-certs\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.764189 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-combined-ca-bundle\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: E0317 11:32:56.764322 4742 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3261b59_fc08_4596_bda8_7b398ef979e4.slice\": RecentStats: unable to find data in memory cache]"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.766125 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b078827f-c462-4bbb-8d77-06a978218545-logs\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.770412 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-internal-tls-certs\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.771940 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-scripts\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.774986 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-combined-ca-bundle\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.778432 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-public-tls-certs\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.779595 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-config-data\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.783429 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsrf6\" (UniqueName: \"kubernetes.io/projected/b078827f-c462-4bbb-8d77-06a978218545-kube-api-access-jsrf6\") pod \"placement-5485d7d4fb-62qtm\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:56 crc kubenswrapper[4742]: I0317 11:32:56.854455 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:57 crc kubenswrapper[4742]: I0317 11:32:57.400721 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5485d7d4fb-62qtm"]
Mar 17 11:32:57 crc kubenswrapper[4742]: W0317 11:32:57.409858 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb078827f_c462_4bbb_8d77_06a978218545.slice/crio-8cbf74bf60be739cab962c034fc899f41be1478339cef37bfbddfb78ef382cb2 WatchSource:0}: Error finding container 8cbf74bf60be739cab962c034fc899f41be1478339cef37bfbddfb78ef382cb2: Status 404 returned error can't find the container with id 8cbf74bf60be739cab962c034fc899f41be1478339cef37bfbddfb78ef382cb2
Mar 17 11:32:57 crc kubenswrapper[4742]: I0317 11:32:57.456685 4742 generic.go:334] "Generic (PLEG): container finished" podID="a81353e8-6a78-46f3-ae59-028afb88c5ef" containerID="0aa271424e124473fa5166acc53c483de31ee97380a2e1b5a179eee85102ec11" exitCode=0
Mar 17 11:32:57 crc kubenswrapper[4742]: I0317 11:32:57.456724 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qf8ng" event={"ID":"a81353e8-6a78-46f3-ae59-028afb88c5ef","Type":"ContainerDied","Data":"0aa271424e124473fa5166acc53c483de31ee97380a2e1b5a179eee85102ec11"}
Mar 17 11:32:57 crc kubenswrapper[4742]: I0317 11:32:57.458439 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5485d7d4fb-62qtm" event={"ID":"b078827f-c462-4bbb-8d77-06a978218545","Type":"ContainerStarted","Data":"8cbf74bf60be739cab962c034fc899f41be1478339cef37bfbddfb78ef382cb2"}
Mar 17 11:32:58 crc kubenswrapper[4742]: I0317 11:32:58.470491 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5485d7d4fb-62qtm" event={"ID":"b078827f-c462-4bbb-8d77-06a978218545","Type":"ContainerStarted","Data":"3325ce256720c2da33849b80aa0da173fb655eaaba82d00416cacdf1b1b8e0f6"}
Mar 17 11:32:58 crc kubenswrapper[4742]: I0317 11:32:58.470779 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5485d7d4fb-62qtm" event={"ID":"b078827f-c462-4bbb-8d77-06a978218545","Type":"ContainerStarted","Data":"ce8b09bc00016b52c043dd68c02387a5a892be022880d1ac0dc4c2353c9a165b"}
Mar 17 11:32:58 crc kubenswrapper[4742]: I0317 11:32:58.487760 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5485d7d4fb-62qtm" podStartSLOduration=2.487741762 podStartE2EDuration="2.487741762s" podCreationTimestamp="2026-03-17 11:32:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:32:58.487057753 +0000 UTC m=+1281.613185511" watchObservedRunningTime="2026-03-17 11:32:58.487741762 +0000 UTC m=+1281.613869510"
Mar 17 11:32:59 crc kubenswrapper[4742]: I0317 11:32:59.478348 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:59 crc kubenswrapper[4742]: I0317 11:32:59.478682 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5485d7d4fb-62qtm"
Mar 17 11:32:59 crc kubenswrapper[4742]: I0317 11:32:59.760680 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qf8ng"
Mar 17 11:32:59 crc kubenswrapper[4742]: I0317 11:32:59.935320 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx6cs\" (UniqueName: \"kubernetes.io/projected/a81353e8-6a78-46f3-ae59-028afb88c5ef-kube-api-access-lx6cs\") pod \"a81353e8-6a78-46f3-ae59-028afb88c5ef\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") "
Mar 17 11:32:59 crc kubenswrapper[4742]: I0317 11:32:59.935724 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-fernet-keys\") pod \"a81353e8-6a78-46f3-ae59-028afb88c5ef\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") "
Mar 17 11:32:59 crc kubenswrapper[4742]: I0317 11:32:59.935780 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-scripts\") pod \"a81353e8-6a78-46f3-ae59-028afb88c5ef\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") "
Mar 17 11:32:59 crc kubenswrapper[4742]: I0317 11:32:59.935846 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-config-data\") pod \"a81353e8-6a78-46f3-ae59-028afb88c5ef\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") "
Mar 17 11:32:59 crc kubenswrapper[4742]: I0317 11:32:59.935890 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-combined-ca-bundle\") pod \"a81353e8-6a78-46f3-ae59-028afb88c5ef\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") "
Mar 17 11:32:59 crc kubenswrapper[4742]: I0317 11:32:59.935988 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-credential-keys\") pod \"a81353e8-6a78-46f3-ae59-028afb88c5ef\" (UID: \"a81353e8-6a78-46f3-ae59-028afb88c5ef\") "
Mar 17 11:32:59 crc kubenswrapper[4742]: I0317 11:32:59.942132 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a81353e8-6a78-46f3-ae59-028afb88c5ef" (UID: "a81353e8-6a78-46f3-ae59-028afb88c5ef"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:32:59 crc kubenswrapper[4742]: I0317 11:32:59.948026 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a81353e8-6a78-46f3-ae59-028afb88c5ef" (UID: "a81353e8-6a78-46f3-ae59-028afb88c5ef"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:32:59 crc kubenswrapper[4742]: I0317 11:32:59.950115 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a81353e8-6a78-46f3-ae59-028afb88c5ef-kube-api-access-lx6cs" (OuterVolumeSpecName: "kube-api-access-lx6cs") pod "a81353e8-6a78-46f3-ae59-028afb88c5ef" (UID: "a81353e8-6a78-46f3-ae59-028afb88c5ef"). InnerVolumeSpecName "kube-api-access-lx6cs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:32:59 crc kubenswrapper[4742]: I0317 11:32:59.970621 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-scripts" (OuterVolumeSpecName: "scripts") pod "a81353e8-6a78-46f3-ae59-028afb88c5ef" (UID: "a81353e8-6a78-46f3-ae59-028afb88c5ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:32:59 crc kubenswrapper[4742]: I0317 11:32:59.987860 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-config-data" (OuterVolumeSpecName: "config-data") pod "a81353e8-6a78-46f3-ae59-028afb88c5ef" (UID: "a81353e8-6a78-46f3-ae59-028afb88c5ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.027426 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a81353e8-6a78-46f3-ae59-028afb88c5ef" (UID: "a81353e8-6a78-46f3-ae59-028afb88c5ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.051244 4742 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.051282 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.051292 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-config-data\") on node \"crc\" DevicePath \"\""
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.051308 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.051319 4742 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a81353e8-6a78-46f3-ae59-028afb88c5ef-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.051329 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx6cs\" (UniqueName: \"kubernetes.io/projected/a81353e8-6a78-46f3-ae59-028afb88c5ef-kube-api-access-lx6cs\") on node \"crc\" DevicePath \"\""
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.487674 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qf8ng" event={"ID":"a81353e8-6a78-46f3-ae59-028afb88c5ef","Type":"ContainerDied","Data":"fd7795d4c5e1142dd1351ed22a55f6680cd55a66df3d15bb89b6b1d6405e0ec4"}
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.487753 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd7795d4c5e1142dd1351ed22a55f6680cd55a66df3d15bb89b6b1d6405e0ec4"
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.487697 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qf8ng"
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.872799 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cf69c6b9b-d9hmq"]
Mar 17 11:33:00 crc kubenswrapper[4742]: E0317 11:33:00.873224 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81353e8-6a78-46f3-ae59-028afb88c5ef" containerName="keystone-bootstrap"
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.873257 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81353e8-6a78-46f3-ae59-028afb88c5ef" containerName="keystone-bootstrap"
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.873442 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="a81353e8-6a78-46f3-ae59-028afb88c5ef" containerName="keystone-bootstrap"
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.874075 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.878092 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.878597 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.878734 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fjbw9"
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.879033 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.879287 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.879426 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 17 11:33:00 crc kubenswrapper[4742]: I0317 11:33:00.904392 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cf69c6b9b-d9hmq"]
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.066259 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-public-tls-certs\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.066336 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgqc6\" (UniqueName: \"kubernetes.io/projected/896b4ef2-200c-4981-b22f-d93e9979c130-kube-api-access-pgqc6\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.066395 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-credential-keys\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.066430 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-scripts\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.066468 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-combined-ca-bundle\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.066489 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-fernet-keys\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.066517 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-internal-tls-certs\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.066537 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-config-data\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.168213 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-fernet-keys\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.168289 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-internal-tls-certs\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.168335 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-config-data\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.168366 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-public-tls-certs\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.168407 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgqc6\" (UniqueName: \"kubernetes.io/projected/896b4ef2-200c-4981-b22f-d93e9979c130-kube-api-access-pgqc6\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.168480 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-credential-keys\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.168526 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-scripts\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.168582 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-combined-ca-bundle\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.185574 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-scripts\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.185739 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-public-tls-certs\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.186517 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-internal-tls-certs\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.186552 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-combined-ca-bundle\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.187897 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-config-data\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.188152 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-credential-keys\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.192481 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/896b4ef2-200c-4981-b22f-d93e9979c130-fernet-keys\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.205515 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgqc6\" (UniqueName: \"kubernetes.io/projected/896b4ef2-200c-4981-b22f-d93e9979c130-kube-api-access-pgqc6\") pod \"keystone-cf69c6b9b-d9hmq\" (UID: \"896b4ef2-200c-4981-b22f-d93e9979c130\") " pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.493928 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-pxjc2"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.495935 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cf69c6b9b-d9hmq"
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.547649 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5lt24"]
Mar 17 11:33:01 crc kubenswrapper[4742]: I0317 11:33:01.547956 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" podUID="3a2ec2cf-0d0e-455c-82af-03ddae3858bd" containerName="dnsmasq-dns" containerID="cri-o://a171d31afe2a8fee197c1e824702384e0fe66f168432f5b13b1587e1abe0d3d0" gracePeriod=10
Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.117692 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5cbc75d594-mvhf5"
Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.118078 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5cbc75d594-mvhf5"
Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.124554 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5cbc75d594-mvhf5" podUID="f2bbef92-cd02-42d8-b81d-ab7248e29328" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused"
Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.204936 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5c4556b444-kq454"
Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.204995 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5c4556b444-kq454"
Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.212152 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5c4556b444-kq454" podUID="480fea20-eab5-4c68-9bc3-9b218ba0b43d" containerName="horizon" probeResult="failure"
output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.344722 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.344958 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.353853 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.353925 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.419322 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.424885 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.437716 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.452487 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.523408 4742 generic.go:334] "Generic (PLEG): container finished" podID="3a2ec2cf-0d0e-455c-82af-03ddae3858bd" containerID="a171d31afe2a8fee197c1e824702384e0fe66f168432f5b13b1587e1abe0d3d0" exitCode=0 Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.524472 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" event={"ID":"3a2ec2cf-0d0e-455c-82af-03ddae3858bd","Type":"ContainerDied","Data":"a171d31afe2a8fee197c1e824702384e0fe66f168432f5b13b1587e1abe0d3d0"} Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.524496 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" event={"ID":"3a2ec2cf-0d0e-455c-82af-03ddae3858bd","Type":"ContainerDied","Data":"5f73ea0aff6744ed1ad9f3c7076712de858887d55d0a7b4932cad52091908a97"} Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.524508 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f73ea0aff6744ed1ad9f3c7076712de858887d55d0a7b4932cad52091908a97" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.524816 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.525014 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.525028 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.525036 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.585065 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-cf69c6b9b-d9hmq"] Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.599738 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.703677 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9cb4\" (UniqueName: \"kubernetes.io/projected/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-kube-api-access-c9cb4\") pod \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.704355 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-dns-swift-storage-0\") pod \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.704433 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-config\") pod \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.704487 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-ovsdbserver-sb\") pod \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.704525 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-dns-svc\") pod \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.704550 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-ovsdbserver-nb\") pod \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\" (UID: \"3a2ec2cf-0d0e-455c-82af-03ddae3858bd\") " Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.711739 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-kube-api-access-c9cb4" (OuterVolumeSpecName: "kube-api-access-c9cb4") pod "3a2ec2cf-0d0e-455c-82af-03ddae3858bd" (UID: "3a2ec2cf-0d0e-455c-82af-03ddae3858bd"). InnerVolumeSpecName "kube-api-access-c9cb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.759394 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3a2ec2cf-0d0e-455c-82af-03ddae3858bd" (UID: "3a2ec2cf-0d0e-455c-82af-03ddae3858bd"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.774757 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3a2ec2cf-0d0e-455c-82af-03ddae3858bd" (UID: "3a2ec2cf-0d0e-455c-82af-03ddae3858bd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.775698 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3a2ec2cf-0d0e-455c-82af-03ddae3858bd" (UID: "3a2ec2cf-0d0e-455c-82af-03ddae3858bd"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.786948 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3a2ec2cf-0d0e-455c-82af-03ddae3858bd" (UID: "3a2ec2cf-0d0e-455c-82af-03ddae3858bd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.788694 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-config" (OuterVolumeSpecName: "config") pod "3a2ec2cf-0d0e-455c-82af-03ddae3858bd" (UID: "3a2ec2cf-0d0e-455c-82af-03ddae3858bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.807151 4742 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.807184 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.807194 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.807205 4742 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.807216 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:02 crc kubenswrapper[4742]: I0317 11:33:02.807225 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9cb4\" (UniqueName: \"kubernetes.io/projected/3a2ec2cf-0d0e-455c-82af-03ddae3858bd-kube-api-access-c9cb4\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:03 crc kubenswrapper[4742]: I0317 11:33:03.537147 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cf69c6b9b-d9hmq" 
event={"ID":"896b4ef2-200c-4981-b22f-d93e9979c130","Type":"ContainerStarted","Data":"da2b01425a480726577c4638bc9dd3314d280e3ac649432948a1f6177567ec8f"} Mar 17 11:33:03 crc kubenswrapper[4742]: I0317 11:33:03.537534 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cf69c6b9b-d9hmq" event={"ID":"896b4ef2-200c-4981-b22f-d93e9979c130","Type":"ContainerStarted","Data":"fdff260484e0bc57fd8dea1e9d793c38494929772fd958ee5c38dc7e0233f2bb"} Mar 17 11:33:03 crc kubenswrapper[4742]: I0317 11:33:03.537567 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-cf69c6b9b-d9hmq" Mar 17 11:33:03 crc kubenswrapper[4742]: I0317 11:33:03.541117 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecfcf738-372c-42d4-a4b0-c1f88be1dd43","Type":"ContainerStarted","Data":"9c2ca57a86dba60ec3cf999145bdd62804018d8dfa6942ffdba5bf3f1ce1ad9f"} Mar 17 11:33:03 crc kubenswrapper[4742]: I0317 11:33:03.541174 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-5lt24" Mar 17 11:33:03 crc kubenswrapper[4742]: I0317 11:33:03.568368 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cf69c6b9b-d9hmq" podStartSLOduration=3.568352128 podStartE2EDuration="3.568352128s" podCreationTimestamp="2026-03-17 11:33:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:33:03.564710937 +0000 UTC m=+1286.690838715" watchObservedRunningTime="2026-03-17 11:33:03.568352128 +0000 UTC m=+1286.694479886" Mar 17 11:33:03 crc kubenswrapper[4742]: I0317 11:33:03.594623 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5lt24"] Mar 17 11:33:03 crc kubenswrapper[4742]: I0317 11:33:03.602489 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-5lt24"] Mar 17 11:33:04 crc kubenswrapper[4742]: I0317 11:33:04.556427 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cs4pt" event={"ID":"90b52e42-6eca-4585-95a0-057055089c97","Type":"ContainerStarted","Data":"726ccc09aac7a7dfce0c397456d3768ad3833d3e4b1325cf1dfe826d69747455"} Mar 17 11:33:04 crc kubenswrapper[4742]: I0317 11:33:04.556714 4742 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 11:33:04 crc kubenswrapper[4742]: I0317 11:33:04.556784 4742 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 11:33:04 crc kubenswrapper[4742]: I0317 11:33:04.580180 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-cs4pt" podStartSLOduration=2.015726985 podStartE2EDuration="41.580160462s" podCreationTimestamp="2026-03-17 11:32:23 +0000 UTC" firstStartedPulling="2026-03-17 11:32:24.544632232 +0000 UTC m=+1247.670759990" lastFinishedPulling="2026-03-17 11:33:04.109065709 +0000 UTC m=+1287.235193467" observedRunningTime="2026-03-17 11:33:04.571677016 +0000 UTC m=+1287.697804774" watchObservedRunningTime="2026-03-17 11:33:04.580160462 +0000 UTC m=+1287.706288220" Mar 17 11:33:04 crc kubenswrapper[4742]: I0317 11:33:04.630515 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 17 11:33:04 crc kubenswrapper[4742]: I0317 11:33:04.630603 4742 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Mar 17 11:33:04 crc kubenswrapper[4742]: I0317 11:33:04.631700 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 17 11:33:04 crc kubenswrapper[4742]: I0317 11:33:04.643896 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 17 11:33:04 crc kubenswrapper[4742]: I0317 11:33:04.686811 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a2ec2cf-0d0e-455c-82af-03ddae3858bd" path="/var/lib/kubelet/pods/3a2ec2cf-0d0e-455c-82af-03ddae3858bd/volumes" Mar 17 11:33:04 crc kubenswrapper[4742]: I0317 11:33:04.782663 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 17 11:33:05 crc kubenswrapper[4742]: I0317 11:33:05.568651 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qzc74" event={"ID":"5c75af6d-6842-49b5-aebe-54feb0644942","Type":"ContainerStarted","Data":"a048e1a32911037a0dbd5f184d943a049e436c92cf0945a43419112df79b53df"} Mar 17 11:33:05 crc kubenswrapper[4742]: I0317 11:33:05.614555 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-qzc74" podStartSLOduration=3.063712484 podStartE2EDuration="42.614530603s" podCreationTimestamp="2026-03-17 11:32:23 +0000 UTC" firstStartedPulling="2026-03-17 11:32:24.652957689 +0000 UTC m=+1247.779085447" lastFinishedPulling="2026-03-17 11:33:04.203775818 +0000 UTC m=+1287.329903566" observedRunningTime="2026-03-17 11:33:05.595865633 +0000 UTC m=+1288.721993391" watchObservedRunningTime="2026-03-17 11:33:05.614530603 +0000 UTC m=+1288.740658361" Mar 17 11:33:07 crc kubenswrapper[4742]: I0317 11:33:07.603071 4742 generic.go:334] "Generic (PLEG): container finished" podID="90b52e42-6eca-4585-95a0-057055089c97" containerID="726ccc09aac7a7dfce0c397456d3768ad3833d3e4b1325cf1dfe826d69747455" exitCode=0 Mar 17 11:33:07 crc kubenswrapper[4742]: I0317 11:33:07.603182 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cs4pt" event={"ID":"90b52e42-6eca-4585-95a0-057055089c97","Type":"ContainerDied","Data":"726ccc09aac7a7dfce0c397456d3768ad3833d3e4b1325cf1dfe826d69747455"} Mar 17 11:33:09 crc kubenswrapper[4742]: I0317 11:33:09.632207 4742 generic.go:334] "Generic (PLEG): container finished" podID="5c75af6d-6842-49b5-aebe-54feb0644942" containerID="a048e1a32911037a0dbd5f184d943a049e436c92cf0945a43419112df79b53df" exitCode=0 Mar 17 11:33:09 crc kubenswrapper[4742]: I0317 11:33:09.632395 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qzc74" event={"ID":"5c75af6d-6842-49b5-aebe-54feb0644942","Type":"ContainerDied","Data":"a048e1a32911037a0dbd5f184d943a049e436c92cf0945a43419112df79b53df"} Mar 17 11:33:11 crc kubenswrapper[4742]: I0317 11:33:11.509338 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-cs4pt" Mar 17 11:33:11 crc kubenswrapper[4742]: I0317 11:33:11.654763 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cs4pt" event={"ID":"90b52e42-6eca-4585-95a0-057055089c97","Type":"ContainerDied","Data":"f707fa9dcb2b6f1652899279bd1a27da340fd0c855146353fc118656f7e0bd11"} Mar 17 11:33:11 crc kubenswrapper[4742]: I0317 11:33:11.654808 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f707fa9dcb2b6f1652899279bd1a27da340fd0c855146353fc118656f7e0bd11" Mar 17 11:33:11 crc kubenswrapper[4742]: I0317 11:33:11.654884 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cs4pt" Mar 17 11:33:11 crc kubenswrapper[4742]: I0317 11:33:11.678759 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/90b52e42-6eca-4585-95a0-057055089c97-db-sync-config-data\") pod \"90b52e42-6eca-4585-95a0-057055089c97\" (UID: \"90b52e42-6eca-4585-95a0-057055089c97\") " Mar 17 11:33:11 crc kubenswrapper[4742]: I0317 11:33:11.678922 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96fnv\" (UniqueName: \"kubernetes.io/projected/90b52e42-6eca-4585-95a0-057055089c97-kube-api-access-96fnv\") pod \"90b52e42-6eca-4585-95a0-057055089c97\" (UID: \"90b52e42-6eca-4585-95a0-057055089c97\") " Mar 17 11:33:11 crc kubenswrapper[4742]: I0317 11:33:11.679421 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90b52e42-6eca-4585-95a0-057055089c97-combined-ca-bundle\") pod \"90b52e42-6eca-4585-95a0-057055089c97\" (UID: \"90b52e42-6eca-4585-95a0-057055089c97\") " Mar 17 11:33:11 crc kubenswrapper[4742]: I0317 11:33:11.685592 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90b52e42-6eca-4585-95a0-057055089c97-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "90b52e42-6eca-4585-95a0-057055089c97" (UID: "90b52e42-6eca-4585-95a0-057055089c97"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:11 crc kubenswrapper[4742]: I0317 11:33:11.686654 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b52e42-6eca-4585-95a0-057055089c97-kube-api-access-96fnv" (OuterVolumeSpecName: "kube-api-access-96fnv") pod "90b52e42-6eca-4585-95a0-057055089c97" (UID: "90b52e42-6eca-4585-95a0-057055089c97"). InnerVolumeSpecName "kube-api-access-96fnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:11 crc kubenswrapper[4742]: I0317 11:33:11.708995 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90b52e42-6eca-4585-95a0-057055089c97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90b52e42-6eca-4585-95a0-057055089c97" (UID: "90b52e42-6eca-4585-95a0-057055089c97"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:11 crc kubenswrapper[4742]: I0317 11:33:11.782017 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90b52e42-6eca-4585-95a0-057055089c97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:11 crc kubenswrapper[4742]: I0317 11:33:11.782047 4742 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/90b52e42-6eca-4585-95a0-057055089c97-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:11 crc kubenswrapper[4742]: I0317 11:33:11.782056 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96fnv\" (UniqueName: \"kubernetes.io/projected/90b52e42-6eca-4585-95a0-057055089c97-kube-api-access-96fnv\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.118463 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5cbc75d594-mvhf5" podUID="f2bbef92-cd02-42d8-b81d-ab7248e29328" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.204470 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5c4556b444-kq454" podUID="480fea20-eab5-4c68-9bc3-9b218ba0b43d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.157:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.157:8443: connect: connection refused" Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.813107 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-68ddcd6d89-6jx5j"] Mar 17 11:33:12 crc kubenswrapper[4742]: E0317 11:33:12.813931 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b52e42-6eca-4585-95a0-057055089c97" containerName="barbican-db-sync" Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.824957 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b52e42-6eca-4585-95a0-057055089c97" containerName="barbican-db-sync" Mar 17 11:33:12 crc kubenswrapper[4742]: E0317 11:33:12.825011 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2ec2cf-0d0e-455c-82af-03ddae3858bd" containerName="dnsmasq-dns" Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.825020 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2ec2cf-0d0e-455c-82af-03ddae3858bd" containerName="dnsmasq-dns" Mar 17 11:33:12 crc kubenswrapper[4742]: E0317 11:33:12.825055 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2ec2cf-0d0e-455c-82af-03ddae3858bd" containerName="init" Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.825063 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2ec2cf-0d0e-455c-82af-03ddae3858bd" containerName="init" Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.825581 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a2ec2cf-0d0e-455c-82af-03ddae3858bd" containerName="dnsmasq-dns" Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.825622 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="90b52e42-6eca-4585-95a0-057055089c97" containerName="barbican-db-sync" Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.826966 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-68ddcd6d89-6jx5j" Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.829941 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.831057 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.831476 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-dd6kx" Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.851793 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-68ddcd6d89-6jx5j"] Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.885121 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-bf65fb77d-664w7"] Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.886807 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.888155 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-bf65fb77d-664w7"] Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.892527 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.908821 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-2s65k"] Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.910219 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:12 crc kubenswrapper[4742]: I0317 11:33:12.936532 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-2s65k"] Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.008072 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.008369 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.008474 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b377427-ca51-4054-9725-545bba6b9319-combined-ca-bundle\") pod \"barbican-worker-68ddcd6d89-6jx5j\" (UID: \"1b377427-ca51-4054-9725-545bba6b9319\") " pod="openstack/barbican-worker-68ddcd6d89-6jx5j" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.008553 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b377427-ca51-4054-9725-545bba6b9319-config-data\") pod \"barbican-worker-68ddcd6d89-6jx5j\" 
(UID: \"1b377427-ca51-4054-9725-545bba6b9319\") " pod="openstack/barbican-worker-68ddcd6d89-6jx5j" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.008641 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45jxm\" (UniqueName: \"kubernetes.io/projected/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-kube-api-access-45jxm\") pod \"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.008721 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ac953fc-7316-4941-920f-8298fd752c3a-config-data-custom\") pod \"barbican-keystone-listener-bf65fb77d-664w7\" (UID: \"8ac953fc-7316-4941-920f-8298fd752c3a\") " pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.008830 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.008928 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-config\") pod \"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.009004 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-dns-svc\") pod \"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.009081 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac953fc-7316-4941-920f-8298fd752c3a-combined-ca-bundle\") pod \"barbican-keystone-listener-bf65fb77d-664w7\" (UID: \"8ac953fc-7316-4941-920f-8298fd752c3a\") " pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.009160 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b377427-ca51-4054-9725-545bba6b9319-logs\") pod \"barbican-worker-68ddcd6d89-6jx5j\" (UID: \"1b377427-ca51-4054-9725-545bba6b9319\") " pod="openstack/barbican-worker-68ddcd6d89-6jx5j" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.009235 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z6sz\" (UniqueName: \"kubernetes.io/projected/1b377427-ca51-4054-9725-545bba6b9319-kube-api-access-9z6sz\") pod \"barbican-worker-68ddcd6d89-6jx5j\" (UID: \"1b377427-ca51-4054-9725-545bba6b9319\") " pod="openstack/barbican-worker-68ddcd6d89-6jx5j" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.009303 4742 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdnf6\" (UniqueName: \"kubernetes.io/projected/8ac953fc-7316-4941-920f-8298fd752c3a-kube-api-access-mdnf6\") pod \"barbican-keystone-listener-bf65fb77d-664w7\" (UID: \"8ac953fc-7316-4941-920f-8298fd752c3a\") " pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.009376 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b377427-ca51-4054-9725-545bba6b9319-config-data-custom\") pod \"barbican-worker-68ddcd6d89-6jx5j\" (UID: \"1b377427-ca51-4054-9725-545bba6b9319\") " pod="openstack/barbican-worker-68ddcd6d89-6jx5j" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.009449 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac953fc-7316-4941-920f-8298fd752c3a-config-data\") pod \"barbican-keystone-listener-bf65fb77d-664w7\" (UID: \"8ac953fc-7316-4941-920f-8298fd752c3a\") " pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.009535 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ac953fc-7316-4941-920f-8298fd752c3a-logs\") pod \"barbican-keystone-listener-bf65fb77d-664w7\" (UID: \"8ac953fc-7316-4941-920f-8298fd752c3a\") " pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.009046 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-76c59bbd5d-b7mv4"] Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.011113 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.017426 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76c59bbd5d-b7mv4"] Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.024889 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.111925 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-config-data\") pod \"barbican-api-76c59bbd5d-b7mv4\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.112014 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ac953fc-7316-4941-920f-8298fd752c3a-logs\") pod \"barbican-keystone-listener-bf65fb77d-664w7\" (UID: \"8ac953fc-7316-4941-920f-8298fd752c3a\") " pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.112047 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-combined-ca-bundle\") pod \"barbican-api-76c59bbd5d-b7mv4\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.112126 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.112511 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ac953fc-7316-4941-920f-8298fd752c3a-logs\") pod \"barbican-keystone-listener-bf65fb77d-664w7\" (UID: \"8ac953fc-7316-4941-920f-8298fd752c3a\") " pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.113113 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.113172 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.113203 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b377427-ca51-4054-9725-545bba6b9319-combined-ca-bundle\") pod \"barbican-worker-68ddcd6d89-6jx5j\" (UID: \"1b377427-ca51-4054-9725-545bba6b9319\") " 
pod="openstack/barbican-worker-68ddcd6d89-6jx5j" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.113845 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.113880 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b377427-ca51-4054-9725-545bba6b9319-config-data\") pod \"barbican-worker-68ddcd6d89-6jx5j\" (UID: \"1b377427-ca51-4054-9725-545bba6b9319\") " pod="openstack/barbican-worker-68ddcd6d89-6jx5j" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.113920 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45jxm\" (UniqueName: \"kubernetes.io/projected/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-kube-api-access-45jxm\") pod \"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.113939 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-logs\") pod \"barbican-api-76c59bbd5d-b7mv4\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.113958 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ac953fc-7316-4941-920f-8298fd752c3a-config-data-custom\") pod \"barbican-keystone-listener-bf65fb77d-664w7\" (UID: \"8ac953fc-7316-4941-920f-8298fd752c3a\") " pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.114020 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.114040 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-config\") pod \"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.114057 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac953fc-7316-4941-920f-8298fd752c3a-combined-ca-bundle\") pod \"barbican-keystone-listener-bf65fb77d-664w7\" (UID: \"8ac953fc-7316-4941-920f-8298fd752c3a\") " pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.114071 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-dns-svc\") pod \"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") 
" pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.114106 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-config-data-custom\") pod \"barbican-api-76c59bbd5d-b7mv4\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.114132 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b377427-ca51-4054-9725-545bba6b9319-logs\") pod \"barbican-worker-68ddcd6d89-6jx5j\" (UID: \"1b377427-ca51-4054-9725-545bba6b9319\") " pod="openstack/barbican-worker-68ddcd6d89-6jx5j" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.114157 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z6sz\" (UniqueName: \"kubernetes.io/projected/1b377427-ca51-4054-9725-545bba6b9319-kube-api-access-9z6sz\") pod \"barbican-worker-68ddcd6d89-6jx5j\" (UID: \"1b377427-ca51-4054-9725-545bba6b9319\") " pod="openstack/barbican-worker-68ddcd6d89-6jx5j" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.114174 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdnf6\" (UniqueName: \"kubernetes.io/projected/8ac953fc-7316-4941-920f-8298fd752c3a-kube-api-access-mdnf6\") pod \"barbican-keystone-listener-bf65fb77d-664w7\" (UID: \"8ac953fc-7316-4941-920f-8298fd752c3a\") " pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.114191 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b377427-ca51-4054-9725-545bba6b9319-config-data-custom\") pod \"barbican-worker-68ddcd6d89-6jx5j\" (UID: \"1b377427-ca51-4054-9725-545bba6b9319\") " pod="openstack/barbican-worker-68ddcd6d89-6jx5j" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.114207 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac953fc-7316-4941-920f-8298fd752c3a-config-data\") pod \"barbican-keystone-listener-bf65fb77d-664w7\" (UID: \"8ac953fc-7316-4941-920f-8298fd752c3a\") " pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.114224 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d42r\" (UniqueName: \"kubernetes.io/projected/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-kube-api-access-5d42r\") pod \"barbican-api-76c59bbd5d-b7mv4\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.115252 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-dns-svc\") pod \"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.115814 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-ovsdbserver-nb\") pod 
\"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.116393 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-config\") pod \"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.116794 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b377427-ca51-4054-9725-545bba6b9319-logs\") pod \"barbican-worker-68ddcd6d89-6jx5j\" (UID: \"1b377427-ca51-4054-9725-545bba6b9319\") " pod="openstack/barbican-worker-68ddcd6d89-6jx5j" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.120859 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b377427-ca51-4054-9725-545bba6b9319-config-data\") pod \"barbican-worker-68ddcd6d89-6jx5j\" (UID: \"1b377427-ca51-4054-9725-545bba6b9319\") " pod="openstack/barbican-worker-68ddcd6d89-6jx5j" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.124959 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac953fc-7316-4941-920f-8298fd752c3a-config-data\") pod \"barbican-keystone-listener-bf65fb77d-664w7\" (UID: \"8ac953fc-7316-4941-920f-8298fd752c3a\") " pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.126520 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b377427-ca51-4054-9725-545bba6b9319-combined-ca-bundle\") pod \"barbican-worker-68ddcd6d89-6jx5j\" (UID: \"1b377427-ca51-4054-9725-545bba6b9319\") " pod="openstack/barbican-worker-68ddcd6d89-6jx5j" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.131569 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac953fc-7316-4941-920f-8298fd752c3a-combined-ca-bundle\") pod \"barbican-keystone-listener-bf65fb77d-664w7\" (UID: \"8ac953fc-7316-4941-920f-8298fd752c3a\") " pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.131980 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45jxm\" (UniqueName: \"kubernetes.io/projected/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-kube-api-access-45jxm\") pod \"dnsmasq-dns-85ff748b95-2s65k\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.132399 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b377427-ca51-4054-9725-545bba6b9319-config-data-custom\") pod \"barbican-worker-68ddcd6d89-6jx5j\" (UID: \"1b377427-ca51-4054-9725-545bba6b9319\") " pod="openstack/barbican-worker-68ddcd6d89-6jx5j" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.135447 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z6sz\" (UniqueName: \"kubernetes.io/projected/1b377427-ca51-4054-9725-545bba6b9319-kube-api-access-9z6sz\") pod \"barbican-worker-68ddcd6d89-6jx5j\" (UID: 
\"1b377427-ca51-4054-9725-545bba6b9319\") " pod="openstack/barbican-worker-68ddcd6d89-6jx5j" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.135570 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ac953fc-7316-4941-920f-8298fd752c3a-config-data-custom\") pod \"barbican-keystone-listener-bf65fb77d-664w7\" (UID: \"8ac953fc-7316-4941-920f-8298fd752c3a\") " pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.138419 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdnf6\" (UniqueName: \"kubernetes.io/projected/8ac953fc-7316-4941-920f-8298fd752c3a-kube-api-access-mdnf6\") pod \"barbican-keystone-listener-bf65fb77d-664w7\" (UID: \"8ac953fc-7316-4941-920f-8298fd752c3a\") " pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.175454 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-68ddcd6d89-6jx5j" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.217106 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-config-data-custom\") pod \"barbican-api-76c59bbd5d-b7mv4\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.217274 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d42r\" (UniqueName: \"kubernetes.io/projected/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-kube-api-access-5d42r\") pod \"barbican-api-76c59bbd5d-b7mv4\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.217391 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-config-data\") pod \"barbican-api-76c59bbd5d-b7mv4\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.217505 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-combined-ca-bundle\") pod \"barbican-api-76c59bbd5d-b7mv4\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.217648 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-logs\") pod \"barbican-api-76c59bbd5d-b7mv4\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.218184 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-logs\") pod \"barbican-api-76c59bbd5d-b7mv4\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.221248 4742 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-config-data-custom\") pod \"barbican-api-76c59bbd5d-b7mv4\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.225291 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.226225 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-combined-ca-bundle\") pod \"barbican-api-76c59bbd5d-b7mv4\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.230782 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-config-data\") pod \"barbican-api-76c59bbd5d-b7mv4\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.234415 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d42r\" (UniqueName: \"kubernetes.io/projected/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-kube-api-access-5d42r\") pod \"barbican-api-76c59bbd5d-b7mv4\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.249687 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.351296 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.384219 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-qzc74" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.527500 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-config-data\") pod \"5c75af6d-6842-49b5-aebe-54feb0644942\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.527819 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-scripts\") pod \"5c75af6d-6842-49b5-aebe-54feb0644942\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.528068 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-db-sync-config-data\") pod \"5c75af6d-6842-49b5-aebe-54feb0644942\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.528476 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c75af6d-6842-49b5-aebe-54feb0644942-etc-machine-id\") pod \"5c75af6d-6842-49b5-aebe-54feb0644942\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.528506 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-combined-ca-bundle\") pod \"5c75af6d-6842-49b5-aebe-54feb0644942\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.528550 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxtkz\" (UniqueName: \"kubernetes.io/projected/5c75af6d-6842-49b5-aebe-54feb0644942-kube-api-access-fxtkz\") pod \"5c75af6d-6842-49b5-aebe-54feb0644942\" (UID: \"5c75af6d-6842-49b5-aebe-54feb0644942\") " Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.530976 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c75af6d-6842-49b5-aebe-54feb0644942-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5c75af6d-6842-49b5-aebe-54feb0644942" (UID: "5c75af6d-6842-49b5-aebe-54feb0644942"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.534098 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c75af6d-6842-49b5-aebe-54feb0644942-kube-api-access-fxtkz" (OuterVolumeSpecName: "kube-api-access-fxtkz") pod "5c75af6d-6842-49b5-aebe-54feb0644942" (UID: "5c75af6d-6842-49b5-aebe-54feb0644942"). InnerVolumeSpecName "kube-api-access-fxtkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.535476 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5c75af6d-6842-49b5-aebe-54feb0644942" (UID: "5c75af6d-6842-49b5-aebe-54feb0644942"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.538224 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-scripts" (OuterVolumeSpecName: "scripts") pod "5c75af6d-6842-49b5-aebe-54feb0644942" (UID: "5c75af6d-6842-49b5-aebe-54feb0644942"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.573158 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-config-data" (OuterVolumeSpecName: "config-data") pod "5c75af6d-6842-49b5-aebe-54feb0644942" (UID: "5c75af6d-6842-49b5-aebe-54feb0644942"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.575831 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c75af6d-6842-49b5-aebe-54feb0644942" (UID: "5c75af6d-6842-49b5-aebe-54feb0644942"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.634949 4742 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c75af6d-6842-49b5-aebe-54feb0644942-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.635218 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.635229 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxtkz\" (UniqueName: \"kubernetes.io/projected/5c75af6d-6842-49b5-aebe-54feb0644942-kube-api-access-fxtkz\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.635242 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.635250 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.635259 4742 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5c75af6d-6842-49b5-aebe-54feb0644942-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.677460 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qzc74" event={"ID":"5c75af6d-6842-49b5-aebe-54feb0644942","Type":"ContainerDied","Data":"6b4fe8e438beafa20a8e2a80310602c0b0abcceb25ccdd88cdb82163d66d9be7"} Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.677500 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b4fe8e438beafa20a8e2a80310602c0b0abcceb25ccdd88cdb82163d66d9be7" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.677596 4742 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qzc74" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.686444 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecfcf738-372c-42d4-a4b0-c1f88be1dd43","Type":"ContainerStarted","Data":"b239448065ceacded0f78151981c090e15baccbedb984760490fbdb65f3183ef"} Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.686794 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerName="ceilometer-central-agent" containerID="cri-o://623151fda616fa23271be1c022ca456476faf88a7e67839a6966fde857b2fa0d" gracePeriod=30 Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.687107 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.687445 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerName="proxy-httpd" containerID="cri-o://b239448065ceacded0f78151981c090e15baccbedb984760490fbdb65f3183ef" gracePeriod=30 Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.687530 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerName="sg-core" containerID="cri-o://9c2ca57a86dba60ec3cf999145bdd62804018d8dfa6942ffdba5bf3f1ce1ad9f" gracePeriod=30 Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.687593 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerName="ceilometer-notification-agent" containerID="cri-o://28894944f3e0b9ab2c2069b968405fd3eb1d38909532b97ffb561b4777f86dbf" gracePeriod=30 Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.709131 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.03747701 podStartE2EDuration="51.709115622s" podCreationTimestamp="2026-03-17 11:32:22 +0000 UTC" firstStartedPulling="2026-03-17 11:32:24.544351284 +0000 UTC m=+1247.670479042" lastFinishedPulling="2026-03-17 11:33:13.215989896 +0000 UTC m=+1296.342117654" observedRunningTime="2026-03-17 11:33:13.708802573 +0000 UTC m=+1296.834930331" watchObservedRunningTime="2026-03-17 11:33:13.709115622 +0000 UTC m=+1296.835243380" Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.775065 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-bf65fb77d-664w7"] Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.900090 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-68ddcd6d89-6jx5j"] Mar 17 11:33:13 crc kubenswrapper[4742]: I0317 11:33:13.910642 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-2s65k"] Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.062043 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76c59bbd5d-b7mv4"] Mar 17 11:33:14 crc kubenswrapper[4742]: W0317 11:33:14.064440 4742 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38dc6520_ca66_44cf_bd2d_6d65bf57ff3a.slice/crio-c43fed673a14414c2d07fee00ae8ee07cea5c016a3751834743d19f4e301e6aa WatchSource:0}: Error finding container c43fed673a14414c2d07fee00ae8ee07cea5c016a3751834743d19f4e301e6aa: Status 404 returned error can't find the container with id c43fed673a14414c2d07fee00ae8ee07cea5c016a3751834743d19f4e301e6aa Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.733887 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68ddcd6d89-6jx5j" event={"ID":"1b377427-ca51-4054-9725-545bba6b9319","Type":"ContainerStarted","Data":"e8b5bd9854afa5e5d4dfb1557eebc3c13b59d5bdad41445f4ee82cf39f8d6bc8"} Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.753444 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 11:33:14 crc kubenswrapper[4742]: E0317 11:33:14.754164 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c75af6d-6842-49b5-aebe-54feb0644942" containerName="cinder-db-sync" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.754256 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c75af6d-6842-49b5-aebe-54feb0644942" containerName="cinder-db-sync" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.754528 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c75af6d-6842-49b5-aebe-54feb0644942" containerName="cinder-db-sync" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.767259 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.769759 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mknj4" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.770292 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-config-data\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.770331 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86924589-f9f6-432e-aa21-0361bbe86f06-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.770395 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-scripts\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.770415 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgtjm\" (UniqueName: \"kubernetes.io/projected/86924589-f9f6-432e-aa21-0361bbe86f06-kube-api-access-cgtjm\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.770435 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.770472 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.771759 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.772183 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c59bbd5d-b7mv4" event={"ID":"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a","Type":"ContainerStarted","Data":"f98f9505566dbe4a3a4f85287f648f6e3592ea6be5687ab836fe65a33e535a24"} Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.772232 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c59bbd5d-b7mv4" event={"ID":"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a","Type":"ContainerStarted","Data":"b3f970fd22311f013fc289b3df07b5f78e7a4f99a5ffb24d66df51f98fa31fd2"} Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.772244 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c59bbd5d-b7mv4" event={"ID":"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a","Type":"ContainerStarted","Data":"c43fed673a14414c2d07fee00ae8ee07cea5c016a3751834743d19f4e301e6aa"} Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.772393 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.781068 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.781415 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.781723 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.827537 4742 generic.go:334] "Generic (PLEG): container finished" podID="fcefa52f-32da-426e-afa4-f2bf3dfa8cc5" containerID="e80764ab8394ca5e9b4fb61434883fb861a6c10d3db422d8aa92d92c0ab9f1a9" exitCode=0 Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.827671 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-2s65k" event={"ID":"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5","Type":"ContainerDied","Data":"e80764ab8394ca5e9b4fb61434883fb861a6c10d3db422d8aa92d92c0ab9f1a9"} Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.827697 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-2s65k" event={"ID":"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5","Type":"ContainerStarted","Data":"f2c7e5ec9357461a55a40bc19e2dbee0e6535ce6076a9a0c46d2dca1c24dd726"} Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.875262 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.875305 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" event={"ID":"8ac953fc-7316-4941-920f-8298fd752c3a","Type":"ContainerStarted","Data":"a1edec616f6215189a3a6dc6cc1d42cc6aa1ad7cf07350967969a8cfb3e6fbbf"} Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.918890 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-scripts\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.918993 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgtjm\" (UniqueName: \"kubernetes.io/projected/86924589-f9f6-432e-aa21-0361bbe86f06-kube-api-access-cgtjm\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.919055 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.919126 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.919255 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-config-data\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.919328 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86924589-f9f6-432e-aa21-0361bbe86f06-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.919437 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86924589-f9f6-432e-aa21-0361bbe86f06-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.947351 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecfcf738-372c-42d4-a4b0-c1f88be1dd43","Type":"ContainerDied","Data":"b239448065ceacded0f78151981c090e15baccbedb984760490fbdb65f3183ef"} Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.947626 4742 generic.go:334] "Generic (PLEG): container finished" podID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerID="b239448065ceacded0f78151981c090e15baccbedb984760490fbdb65f3183ef" exitCode=0 Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.947790 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ecfcf738-372c-42d4-a4b0-c1f88be1dd43","Type":"ContainerDied","Data":"9c2ca57a86dba60ec3cf999145bdd62804018d8dfa6942ffdba5bf3f1ce1ad9f"} Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.947874 4742 generic.go:334] "Generic (PLEG): container finished" podID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerID="9c2ca57a86dba60ec3cf999145bdd62804018d8dfa6942ffdba5bf3f1ce1ad9f" exitCode=2 Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.947953 4742 generic.go:334] "Generic (PLEG): container finished" podID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerID="623151fda616fa23271be1c022ca456476faf88a7e67839a6966fde857b2fa0d" exitCode=0 Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.948040 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecfcf738-372c-42d4-a4b0-c1f88be1dd43","Type":"ContainerDied","Data":"623151fda616fa23271be1c022ca456476faf88a7e67839a6966fde857b2fa0d"} Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.950632 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-scripts\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.956891 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-config-data\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.961495 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.973646 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.977231 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-2s65k"] Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.977829 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-76c59bbd5d-b7mv4" podStartSLOduration=2.977819711 podStartE2EDuration="2.977819711s" podCreationTimestamp="2026-03-17 11:33:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:33:14.9156886 +0000 UTC m=+1298.041816358" watchObservedRunningTime="2026-03-17 11:33:14.977819711 +0000 UTC m=+1298.103947459" Mar 17 11:33:14 crc kubenswrapper[4742]: I0317 11:33:14.992517 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgtjm\" (UniqueName: \"kubernetes.io/projected/86924589-f9f6-432e-aa21-0361bbe86f06-kube-api-access-cgtjm\") pod \"cinder-scheduler-0\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 
11:33:15.064960 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-ljkt4"] Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.066614 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.073276 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-ljkt4"] Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.087979 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.089681 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.093316 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.099010 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.122864 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bxkc\" (UniqueName: \"kubernetes.io/projected/5948988b-0036-4d62-9511-23a900c10b83-kube-api-access-9bxkc\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.122963 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.123002 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.123035 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-config\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.123067 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.123087 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc 
kubenswrapper[4742]: I0317 11:33:15.132149 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.227477 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.227849 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cd851dc-11b1-43f8-9853-9209a5d656c0-logs\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.227873 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.227900 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.227943 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-config\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.227965 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cd851dc-11b1-43f8-9853-9209a5d656c0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.227987 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.228005 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.228041 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9cdq\" (UniqueName: \"kubernetes.io/projected/6cd851dc-11b1-43f8-9853-9209a5d656c0-kube-api-access-b9cdq\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " 
pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.228060 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bxkc\" (UniqueName: \"kubernetes.io/projected/5948988b-0036-4d62-9511-23a900c10b83-kube-api-access-9bxkc\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.228092 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-config-data-custom\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.228126 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-config-data\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.228146 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-scripts\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.228881 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.229427 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.229922 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-config\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.230435 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.230986 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.283960 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-9bxkc\" (UniqueName: \"kubernetes.io/projected/5948988b-0036-4d62-9511-23a900c10b83-kube-api-access-9bxkc\") pod \"dnsmasq-dns-5c9776ccc5-ljkt4\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.333333 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9cdq\" (UniqueName: \"kubernetes.io/projected/6cd851dc-11b1-43f8-9853-9209a5d656c0-kube-api-access-b9cdq\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.333436 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-config-data-custom\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.333518 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-config-data\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.333588 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-scripts\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.333675 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cd851dc-11b1-43f8-9853-9209a5d656c0-logs\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.333724 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.333791 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cd851dc-11b1-43f8-9853-9209a5d656c0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.333960 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cd851dc-11b1-43f8-9853-9209a5d656c0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.342326 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cd851dc-11b1-43f8-9853-9209a5d656c0-logs\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.374695 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-config-data-custom\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.383351 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-scripts\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.383665 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9cdq\" (UniqueName: \"kubernetes.io/projected/6cd851dc-11b1-43f8-9853-9209a5d656c0-kube-api-access-b9cdq\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.391748 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.401699 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-config-data\") pod \"cinder-api-0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.407140 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.624510 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 17 11:33:15 crc kubenswrapper[4742]: I0317 11:33:15.709483 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 11:33:15 crc kubenswrapper[4742]: E0317 11:33:15.825327 4742 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 17 11:33:15 crc kubenswrapper[4742]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 17 11:33:15 crc kubenswrapper[4742]: > podSandboxID="f2c7e5ec9357461a55a40bc19e2dbee0e6535ce6076a9a0c46d2dca1c24dd726" Mar 17 11:33:15 crc kubenswrapper[4742]: E0317 11:33:15.825485 4742 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 17 11:33:15 crc kubenswrapper[4742]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7ch57ch5c5hcch589hf7h577h659h96h5c8h5b4h55fhbbh667h565h5bchcbh58dh7dh5bch586h56ch574h598h67dh5c8h56dh8bh574h564hbch7q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-45jxm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-85ff748b95-2s65k_openstack(fcefa52f-32da-426e-afa4-f2bf3dfa8cc5): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 17 11:33:15 crc kubenswrapper[4742]: > logger="UnhandledError" Mar 17 11:33:15 crc kubenswrapper[4742]: E0317 11:33:15.827460 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-85ff748b95-2s65k" podUID="fcefa52f-32da-426e-afa4-f2bf3dfa8cc5" Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.457627 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.554544 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-ovsdbserver-nb\") pod \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.555036 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-config\") pod \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.555116 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-dns-swift-storage-0\") pod \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.555145 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-dns-svc\") pod \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.555193 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45jxm\" (UniqueName: \"kubernetes.io/projected/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-kube-api-access-45jxm\") pod \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.555260 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-ovsdbserver-sb\") pod \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\" (UID: \"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5\") " Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.564473 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-kube-api-access-45jxm" (OuterVolumeSpecName: "kube-api-access-45jxm") pod "fcefa52f-32da-426e-afa4-f2bf3dfa8cc5" (UID: "fcefa52f-32da-426e-afa4-f2bf3dfa8cc5"). InnerVolumeSpecName "kube-api-access-45jxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.627475 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fcefa52f-32da-426e-afa4-f2bf3dfa8cc5" (UID: "fcefa52f-32da-426e-afa4-f2bf3dfa8cc5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.645704 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fcefa52f-32da-426e-afa4-f2bf3dfa8cc5" (UID: "fcefa52f-32da-426e-afa4-f2bf3dfa8cc5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.649051 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-config" (OuterVolumeSpecName: "config") pod "fcefa52f-32da-426e-afa4-f2bf3dfa8cc5" (UID: "fcefa52f-32da-426e-afa4-f2bf3dfa8cc5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.657556 4742 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.657596 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45jxm\" (UniqueName: \"kubernetes.io/projected/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-kube-api-access-45jxm\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.657611 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.657623 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.667711 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fcefa52f-32da-426e-afa4-f2bf3dfa8cc5" (UID: "fcefa52f-32da-426e-afa4-f2bf3dfa8cc5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.668950 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fcefa52f-32da-426e-afa4-f2bf3dfa8cc5" (UID: "fcefa52f-32da-426e-afa4-f2bf3dfa8cc5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.768334 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.768372 4742 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:16 crc kubenswrapper[4742]: W0317 11:33:16.829422 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5948988b_0036_4d62_9511_23a900c10b83.slice/crio-ea252bee8883e8bcbbbd22f6c1cc990375cbd39f55e5e9c1b2abb76c8ac79587 WatchSource:0}: Error finding container ea252bee8883e8bcbbbd22f6c1cc990375cbd39f55e5e9c1b2abb76c8ac79587: Status 404 returned error can't find the container with id ea252bee8883e8bcbbbd22f6c1cc990375cbd39f55e5e9c1b2abb76c8ac79587 Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.830113 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-ljkt4"] Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.902961 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.971744 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68ddcd6d89-6jx5j" event={"ID":"1b377427-ca51-4054-9725-545bba6b9319","Type":"ContainerStarted","Data":"35e821eb2d125feca39161e967c3c30b99bcf97b50cf29e7caddbfd9890f4ce9"} Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.976027 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" event={"ID":"5948988b-0036-4d62-9511-23a900c10b83","Type":"ContainerStarted","Data":"ea252bee8883e8bcbbbd22f6c1cc990375cbd39f55e5e9c1b2abb76c8ac79587"} Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.977670 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6cd851dc-11b1-43f8-9853-9209a5d656c0","Type":"ContainerStarted","Data":"7d9c814ee5ba1ff69c373a1867acf488d05befe08685e604bac23cc8526eb80a"} Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.980218 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-2s65k" event={"ID":"fcefa52f-32da-426e-afa4-f2bf3dfa8cc5","Type":"ContainerDied","Data":"f2c7e5ec9357461a55a40bc19e2dbee0e6535ce6076a9a0c46d2dca1c24dd726"} Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.980224 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-2s65k" Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.980261 4742 scope.go:117] "RemoveContainer" containerID="e80764ab8394ca5e9b4fb61434883fb861a6c10d3db422d8aa92d92c0ab9f1a9" Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.984744 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" event={"ID":"8ac953fc-7316-4941-920f-8298fd752c3a","Type":"ContainerStarted","Data":"fc4620c0a6ead09503ff069762743263a729c9e57affc35131ee6445db985ff4"} Mar 17 11:33:16 crc kubenswrapper[4742]: I0317 11:33:16.986261 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86924589-f9f6-432e-aa21-0361bbe86f06","Type":"ContainerStarted","Data":"0ec897e190c6e6a3522e16c7367cb3e55a327c19a86f7fc40021001955737c14"} Mar 17 11:33:17 crc kubenswrapper[4742]: I0317 11:33:17.029031 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-2s65k"] Mar 17 11:33:17 crc kubenswrapper[4742]: I0317 11:33:17.055875 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-2s65k"] Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.012756 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.039838 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-68ddcd6d89-6jx5j" event={"ID":"1b377427-ca51-4054-9725-545bba6b9319","Type":"ContainerStarted","Data":"f8a58f4b499e2fc82cb3c6f5aa0777a2faeca21906a7958fc6b6909f19edbdcd"} Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.050835 4742 generic.go:334] "Generic (PLEG): container finished" podID="5948988b-0036-4d62-9511-23a900c10b83" containerID="3c98d1b47193f72683a76ca8146296b6516a60dafc118d5658643e658d4a17fe" exitCode=0 Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.050890 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" event={"ID":"5948988b-0036-4d62-9511-23a900c10b83","Type":"ContainerDied","Data":"3c98d1b47193f72683a76ca8146296b6516a60dafc118d5658643e658d4a17fe"} Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.067936 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6cd851dc-11b1-43f8-9853-9209a5d656c0","Type":"ContainerStarted","Data":"58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804"} Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.073929 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-68ddcd6d89-6jx5j" podStartSLOduration=3.635470999 podStartE2EDuration="6.073894349s" podCreationTimestamp="2026-03-17 11:33:12 +0000 UTC" firstStartedPulling="2026-03-17 11:33:13.900816432 +0000 UTC m=+1297.026944190" lastFinishedPulling="2026-03-17 11:33:16.339239762 +0000 UTC m=+1299.465367540" observedRunningTime="2026-03-17 11:33:18.06386772 +0000 UTC m=+1301.189995478" watchObservedRunningTime="2026-03-17 11:33:18.073894349 +0000 UTC m=+1301.200022107" Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.094140 4742 generic.go:334] "Generic (PLEG): container finished" podID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerID="28894944f3e0b9ab2c2069b968405fd3eb1d38909532b97ffb561b4777f86dbf" exitCode=0 Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.094206 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"ecfcf738-372c-42d4-a4b0-c1f88be1dd43","Type":"ContainerDied","Data":"28894944f3e0b9ab2c2069b968405fd3eb1d38909532b97ffb561b4777f86dbf"} Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.103367 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" event={"ID":"8ac953fc-7316-4941-920f-8298fd752c3a","Type":"ContainerStarted","Data":"7d3a4a3815aff759573ce122b11c9692bfd85693e062351bd5a12188317cf54b"} Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.125078 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-bf65fb77d-664w7" podStartSLOduration=3.5679913389999998 podStartE2EDuration="6.125060165s" podCreationTimestamp="2026-03-17 11:33:12 +0000 UTC" firstStartedPulling="2026-03-17 11:33:13.7844421 +0000 UTC m=+1296.910569858" lastFinishedPulling="2026-03-17 11:33:16.341510916 +0000 UTC m=+1299.467638684" observedRunningTime="2026-03-17 11:33:18.118380649 +0000 UTC m=+1301.244508407" watchObservedRunningTime="2026-03-17 11:33:18.125060165 +0000 UTC m=+1301.251187923" Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.340033 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.413181 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-sg-core-conf-yaml\") pod \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.413266 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-combined-ca-bundle\") pod \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.413319 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-run-httpd\") pod \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.413354 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-scripts\") pod \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.413701 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ecfcf738-372c-42d4-a4b0-c1f88be1dd43" (UID: "ecfcf738-372c-42d4-a4b0-c1f88be1dd43"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.413771 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-log-httpd\") pod \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.413822 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-config-data\") pod \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.413965 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd8tp\" (UniqueName: \"kubernetes.io/projected/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-kube-api-access-kd8tp\") pod \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\" (UID: \"ecfcf738-372c-42d4-a4b0-c1f88be1dd43\") " Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.414280 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ecfcf738-372c-42d4-a4b0-c1f88be1dd43" (UID: "ecfcf738-372c-42d4-a4b0-c1f88be1dd43"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.414969 4742 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.414996 4742 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.424971 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-kube-api-access-kd8tp" (OuterVolumeSpecName: "kube-api-access-kd8tp") pod "ecfcf738-372c-42d4-a4b0-c1f88be1dd43" (UID: "ecfcf738-372c-42d4-a4b0-c1f88be1dd43"). InnerVolumeSpecName "kube-api-access-kd8tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.429936 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-scripts" (OuterVolumeSpecName: "scripts") pod "ecfcf738-372c-42d4-a4b0-c1f88be1dd43" (UID: "ecfcf738-372c-42d4-a4b0-c1f88be1dd43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.488037 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ecfcf738-372c-42d4-a4b0-c1f88be1dd43" (UID: "ecfcf738-372c-42d4-a4b0-c1f88be1dd43"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.516816 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.516846 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd8tp\" (UniqueName: \"kubernetes.io/projected/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-kube-api-access-kd8tp\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.516857 4742 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.531296 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecfcf738-372c-42d4-a4b0-c1f88be1dd43" (UID: "ecfcf738-372c-42d4-a4b0-c1f88be1dd43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.560400 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-config-data" (OuterVolumeSpecName: "config-data") pod "ecfcf738-372c-42d4-a4b0-c1f88be1dd43" (UID: "ecfcf738-372c-42d4-a4b0-c1f88be1dd43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.618142 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.618172 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecfcf738-372c-42d4-a4b0-c1f88be1dd43-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:18 crc kubenswrapper[4742]: I0317 11:33:18.680583 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcefa52f-32da-426e-afa4-f2bf3dfa8cc5" path="/var/lib/kubelet/pods/fcefa52f-32da-426e-afa4-f2bf3dfa8cc5/volumes" Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.125634 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" event={"ID":"5948988b-0036-4d62-9511-23a900c10b83","Type":"ContainerStarted","Data":"65cd1ca5aa87160533fcaec3928dfcfce67c009df9274d2dd03654bff8b066a6"} Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.126123 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.128106 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6cd851dc-11b1-43f8-9853-9209a5d656c0","Type":"ContainerStarted","Data":"4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad"} Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.128233 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6cd851dc-11b1-43f8-9853-9209a5d656c0" containerName="cinder-api-log" 
containerID="cri-o://58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804" gracePeriod=30 Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.128307 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.128336 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6cd851dc-11b1-43f8-9853-9209a5d656c0" containerName="cinder-api" containerID="cri-o://4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad" gracePeriod=30 Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.155283 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ecfcf738-372c-42d4-a4b0-c1f88be1dd43","Type":"ContainerDied","Data":"53deb4fa8067e4c621e16c5328ef6180397f859dc31bf31e84cbd9ddbd1ab926"} Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.155351 4742 scope.go:117] "RemoveContainer" containerID="b239448065ceacded0f78151981c090e15baccbedb984760490fbdb65f3183ef" Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.155562 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.161298 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" podStartSLOduration=5.161277488 podStartE2EDuration="5.161277488s" podCreationTimestamp="2026-03-17 11:33:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:33:19.157561664 +0000 UTC m=+1302.283689462" watchObservedRunningTime="2026-03-17 11:33:19.161277488 +0000 UTC m=+1302.287405276" Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.168428 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86924589-f9f6-432e-aa21-0361bbe86f06","Type":"ContainerStarted","Data":"2cd403c6657b46f609799733644e3b131de0e1ebcbc9251385f29c6bb26d3dc5"} Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.196095 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.196071027 podStartE2EDuration="4.196071027s" podCreationTimestamp="2026-03-17 11:33:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:33:19.179990189 +0000 UTC m=+1302.306117967" watchObservedRunningTime="2026-03-17 11:33:19.196071027 +0000 UTC m=+1302.322198795" Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.246252 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.261443 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.279420 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:33:19 crc kubenswrapper[4742]: E0317 11:33:19.279807 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerName="proxy-httpd" Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.279824 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerName="proxy-httpd" Mar 17 11:33:19 crc kubenswrapper[4742]: 
Mar 17 11:33:19 crc kubenswrapper[4742]: E0317 11:33:19.279835 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerName="ceilometer-notification-agent"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.279841 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerName="ceilometer-notification-agent"
Mar 17 11:33:19 crc kubenswrapper[4742]: E0317 11:33:19.279885 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerName="sg-core"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.279892 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerName="sg-core"
Mar 17 11:33:19 crc kubenswrapper[4742]: E0317 11:33:19.280786 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerName="ceilometer-central-agent"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.280805 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerName="ceilometer-central-agent"
Mar 17 11:33:19 crc kubenswrapper[4742]: E0317 11:33:19.280816 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcefa52f-32da-426e-afa4-f2bf3dfa8cc5" containerName="init"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.280825 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcefa52f-32da-426e-afa4-f2bf3dfa8cc5" containerName="init"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.280949 4742 scope.go:117] "RemoveContainer" containerID="9c2ca57a86dba60ec3cf999145bdd62804018d8dfa6942ffdba5bf3f1ce1ad9f"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.290077 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcefa52f-32da-426e-afa4-f2bf3dfa8cc5" containerName="init"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.290148 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerName="sg-core"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.290158 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerName="ceilometer-central-agent"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.290174 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerName="proxy-httpd"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.290189 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" containerName="ceilometer-notification-agent"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.291882 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.295408 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.296189 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.317361 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.336808 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc51fde3-3dea-4f98-ac76-18d3a410fab7-log-httpd\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.336853 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-scripts\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.336922 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.336939 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc51fde3-3dea-4f98-ac76-18d3a410fab7-run-httpd\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.336959 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-config-data\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.336981 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p47m\" (UniqueName: \"kubernetes.io/projected/cc51fde3-3dea-4f98-ac76-18d3a410fab7-kube-api-access-2p47m\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.337075 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.346129 4742 scope.go:117] "RemoveContainer" containerID="28894944f3e0b9ab2c2069b968405fd3eb1d38909532b97ffb561b4777f86dbf"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.438696 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc51fde3-3dea-4f98-ac76-18d3a410fab7-log-httpd\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.438762 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-scripts\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.438835 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.438860 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc51fde3-3dea-4f98-ac76-18d3a410fab7-run-httpd\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.438889 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-config-data\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.438935 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p47m\" (UniqueName: \"kubernetes.io/projected/cc51fde3-3dea-4f98-ac76-18d3a410fab7-kube-api-access-2p47m\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.439019 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.440129 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc51fde3-3dea-4f98-ac76-18d3a410fab7-run-httpd\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.441887 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc51fde3-3dea-4f98-ac76-18d3a410fab7-log-httpd\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.443285 4742 scope.go:117] "RemoveContainer" containerID="623151fda616fa23271be1c022ca456476faf88a7e67839a6966fde857b2fa0d"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.445112 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-config-data\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.447487 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.448362 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-scripts\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.453939 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.462730 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p47m\" (UniqueName: \"kubernetes.io/projected/cc51fde3-3dea-4f98-ac76-18d3a410fab7-kube-api-access-2p47m\") pod \"ceilometer-0\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") " pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.564880 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.667279 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6f76787fd-cvxz9"]
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.692113 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f76787fd-cvxz9"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.696470 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.711544 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.733609 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f76787fd-cvxz9"]
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.754501 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-internal-tls-certs\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.754571 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-public-tls-certs\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.754595 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-config-data\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.754630 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx5fh\" (UniqueName: \"kubernetes.io/projected/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-kube-api-access-rx5fh\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.754824 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-config-data-custom\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.754854 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-combined-ca-bundle\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.754887 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-logs\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.875971 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-internal-tls-certs\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.876275 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-public-tls-certs\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.876301 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-config-data\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.876328 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx5fh\" (UniqueName: \"kubernetes.io/projected/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-kube-api-access-rx5fh\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.876371 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-config-data-custom\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.876396 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-combined-ca-bundle\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.876426 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-logs\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.879397 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-logs\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9"
Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.884952 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-config-data\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9"
\"kubernetes.io/secret/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-combined-ca-bundle\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9" Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.889564 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-internal-tls-certs\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9" Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.890846 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-public-tls-certs\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9" Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.896981 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-config-data-custom\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9" Mar 17 11:33:19 crc kubenswrapper[4742]: I0317 11:33:19.929990 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx5fh\" (UniqueName: \"kubernetes.io/projected/f550d045-d552-4ea9-b5c8-a4e7d9ff29a1-kube-api-access-rx5fh\") pod \"barbican-api-6f76787fd-cvxz9\" (UID: \"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1\") " pod="openstack/barbican-api-6f76787fd-cvxz9" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.007396 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.078719 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-scripts\") pod \"6cd851dc-11b1-43f8-9853-9209a5d656c0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.078762 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-config-data\") pod \"6cd851dc-11b1-43f8-9853-9209a5d656c0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.078961 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-combined-ca-bundle\") pod \"6cd851dc-11b1-43f8-9853-9209a5d656c0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.078986 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cd851dc-11b1-43f8-9853-9209a5d656c0-logs\") pod \"6cd851dc-11b1-43f8-9853-9209a5d656c0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.079023 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cd851dc-11b1-43f8-9853-9209a5d656c0-etc-machine-id\") pod \"6cd851dc-11b1-43f8-9853-9209a5d656c0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.079063 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-config-data-custom\") pod \"6cd851dc-11b1-43f8-9853-9209a5d656c0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.079130 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9cdq\" (UniqueName: \"kubernetes.io/projected/6cd851dc-11b1-43f8-9853-9209a5d656c0-kube-api-access-b9cdq\") pod \"6cd851dc-11b1-43f8-9853-9209a5d656c0\" (UID: \"6cd851dc-11b1-43f8-9853-9209a5d656c0\") " Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.080198 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6cd851dc-11b1-43f8-9853-9209a5d656c0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6cd851dc-11b1-43f8-9853-9209a5d656c0" (UID: "6cd851dc-11b1-43f8-9853-9209a5d656c0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.080578 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cd851dc-11b1-43f8-9853-9209a5d656c0-logs" (OuterVolumeSpecName: "logs") pod "6cd851dc-11b1-43f8-9853-9209a5d656c0" (UID: "6cd851dc-11b1-43f8-9853-9209a5d656c0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.084147 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6cd851dc-11b1-43f8-9853-9209a5d656c0" (UID: "6cd851dc-11b1-43f8-9853-9209a5d656c0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.085039 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-scripts" (OuterVolumeSpecName: "scripts") pod "6cd851dc-11b1-43f8-9853-9209a5d656c0" (UID: "6cd851dc-11b1-43f8-9853-9209a5d656c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.085436 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cd851dc-11b1-43f8-9853-9209a5d656c0-kube-api-access-b9cdq" (OuterVolumeSpecName: "kube-api-access-b9cdq") pod "6cd851dc-11b1-43f8-9853-9209a5d656c0" (UID: "6cd851dc-11b1-43f8-9853-9209a5d656c0"). InnerVolumeSpecName "kube-api-access-b9cdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.087245 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6f76787fd-cvxz9" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.112097 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cd851dc-11b1-43f8-9853-9209a5d656c0" (UID: "6cd851dc-11b1-43f8-9853-9209a5d656c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.148923 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-config-data" (OuterVolumeSpecName: "config-data") pod "6cd851dc-11b1-43f8-9853-9209a5d656c0" (UID: "6cd851dc-11b1-43f8-9853-9209a5d656c0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.181128 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.181161 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.181170 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.181184 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cd851dc-11b1-43f8-9853-9209a5d656c0-logs\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.181193 4742 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cd851dc-11b1-43f8-9853-9209a5d656c0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.181200 4742 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cd851dc-11b1-43f8-9853-9209a5d656c0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.181209 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9cdq\" (UniqueName: \"kubernetes.io/projected/6cd851dc-11b1-43f8-9853-9209a5d656c0-kube-api-access-b9cdq\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.222596 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86924589-f9f6-432e-aa21-0361bbe86f06","Type":"ContainerStarted","Data":"e4ffd56f3aa6ed866dfeb45b882d14c6fe20822296aa78b5b7b92c44a0fb9b5c"} Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.238611 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.239658 4742 generic.go:334] "Generic (PLEG): container finished" podID="6cd851dc-11b1-43f8-9853-9209a5d656c0" containerID="4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad" exitCode=0 Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.239678 4742 generic.go:334] "Generic (PLEG): container finished" podID="6cd851dc-11b1-43f8-9853-9209a5d656c0" containerID="58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804" exitCode=143 Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.239738 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.239788 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6cd851dc-11b1-43f8-9853-9209a5d656c0","Type":"ContainerDied","Data":"4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad"} Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.239809 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6cd851dc-11b1-43f8-9853-9209a5d656c0","Type":"ContainerDied","Data":"58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804"} Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.239818 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6cd851dc-11b1-43f8-9853-9209a5d656c0","Type":"ContainerDied","Data":"7d9c814ee5ba1ff69c373a1867acf488d05befe08685e604bac23cc8526eb80a"} Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.239834 4742 scope.go:117] "RemoveContainer" containerID="4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.259539 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.393249074 podStartE2EDuration="6.259519888s" podCreationTimestamp="2026-03-17 11:33:14 +0000 UTC" firstStartedPulling="2026-03-17 11:33:16.261129156 +0000 UTC m=+1299.387256914" lastFinishedPulling="2026-03-17 11:33:18.12739997 +0000 UTC m=+1301.253527728" observedRunningTime="2026-03-17 11:33:20.259336113 +0000 UTC m=+1303.385463881" watchObservedRunningTime="2026-03-17 11:33:20.259519888 +0000 UTC m=+1303.385647646" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.305331 4742 scope.go:117] "RemoveContainer" containerID="58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.331106 4742 scope.go:117] "RemoveContainer" containerID="4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.334422 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 17 11:33:20 crc kubenswrapper[4742]: E0317 11:33:20.334592 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad\": container with ID starting with 4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad not found: ID does not exist" containerID="4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.334655 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad"} err="failed to get container status \"4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad\": rpc error: code = NotFound desc = could not find container \"4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad\": container with ID starting with 4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad not found: ID does not exist" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.334678 4742 scope.go:117] "RemoveContainer" containerID="58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804" Mar 17 11:33:20 crc kubenswrapper[4742]: E0317 11:33:20.338785 4742 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804\": container with ID starting with 58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804 not found: ID does not exist" containerID="58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.338867 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804"} err="failed to get container status \"58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804\": rpc error: code = NotFound desc = could not find container \"58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804\": container with ID starting with 58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804 not found: ID does not exist" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.338945 4742 scope.go:117] "RemoveContainer" containerID="4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.339803 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad"} err="failed to get container status \"4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad\": rpc error: code = NotFound desc = could not find container \"4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad\": container with ID starting with 4eb3d99ed77e8fe1d1eb305aac540510d714ed8f0e45edc72c8d9cad3de348ad not found: ID does not exist" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.339846 4742 scope.go:117] "RemoveContainer" containerID="58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.341139 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804"} err="failed to get container status \"58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804\": rpc error: code = NotFound desc = could not find container \"58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804\": container with ID starting with 58061f7f51c015f01766c1ba648e8990e1f91badb309b6e3e0759ee588cd5804 not found: ID does not exist" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.349621 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.355645 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 17 11:33:20 crc kubenswrapper[4742]: E0317 11:33:20.356108 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd851dc-11b1-43f8-9853-9209a5d656c0" containerName="cinder-api" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.356123 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd851dc-11b1-43f8-9853-9209a5d656c0" containerName="cinder-api" Mar 17 11:33:20 crc kubenswrapper[4742]: E0317 11:33:20.356146 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd851dc-11b1-43f8-9853-9209a5d656c0" containerName="cinder-api-log" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.356153 4742 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6cd851dc-11b1-43f8-9853-9209a5d656c0" containerName="cinder-api-log" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.356353 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd851dc-11b1-43f8-9853-9209a5d656c0" containerName="cinder-api-log" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.356373 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd851dc-11b1-43f8-9853-9209a5d656c0" containerName="cinder-api" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.357714 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.363719 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.363739 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.363747 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.363833 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.488153 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e36e2fb7-b344-4c81-9922-3d9bc9526261-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.488199 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8cz4\" (UniqueName: \"kubernetes.io/projected/e36e2fb7-b344-4c81-9922-3d9bc9526261-kube-api-access-g8cz4\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.488225 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-scripts\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.488253 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.488367 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e36e2fb7-b344-4c81-9922-3d9bc9526261-logs\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.488614 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-config-data\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc 
kubenswrapper[4742]: I0317 11:33:20.488851 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.488897 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-config-data-custom\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.489067 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.550236 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f76787fd-cvxz9"] Mar 17 11:33:20 crc kubenswrapper[4742]: W0317 11:33:20.554025 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf550d045_d552_4ea9_b5c8_a4e7d9ff29a1.slice/crio-5d2f7fdd85819b2d18ba5d8e38ba3438cbd4be0880f5e847756e032587b0f7bb WatchSource:0}: Error finding container 5d2f7fdd85819b2d18ba5d8e38ba3438cbd4be0880f5e847756e032587b0f7bb: Status 404 returned error can't find the container with id 5d2f7fdd85819b2d18ba5d8e38ba3438cbd4be0880f5e847756e032587b0f7bb Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.591138 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.591195 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e36e2fb7-b344-4c81-9922-3d9bc9526261-logs\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.591268 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-config-data\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.591348 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.591374 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.591408 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.591464 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e36e2fb7-b344-4c81-9922-3d9bc9526261-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.591488 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8cz4\" (UniqueName: \"kubernetes.io/projected/e36e2fb7-b344-4c81-9922-3d9bc9526261-kube-api-access-g8cz4\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.591530 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-scripts\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.592672 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e36e2fb7-b344-4c81-9922-3d9bc9526261-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.596189 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e36e2fb7-b344-4c81-9922-3d9bc9526261-logs\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.599073 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.599859 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-config-data-custom\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.601292 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.603742 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-config-data\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " 
pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.608498 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-scripts\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.610593 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e36e2fb7-b344-4c81-9922-3d9bc9526261-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.612862 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8cz4\" (UniqueName: \"kubernetes.io/projected/e36e2fb7-b344-4c81-9922-3d9bc9526261-kube-api-access-g8cz4\") pod \"cinder-api-0\" (UID: \"e36e2fb7-b344-4c81-9922-3d9bc9526261\") " pod="openstack/cinder-api-0" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.677426 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cd851dc-11b1-43f8-9853-9209a5d656c0" path="/var/lib/kubelet/pods/6cd851dc-11b1-43f8-9853-9209a5d656c0/volumes" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.678573 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecfcf738-372c-42d4-a4b0-c1f88be1dd43" path="/var/lib/kubelet/pods/ecfcf738-372c-42d4-a4b0-c1f88be1dd43/volumes" Mar 17 11:33:20 crc kubenswrapper[4742]: I0317 11:33:20.693279 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 17 11:33:21 crc kubenswrapper[4742]: W0317 11:33:21.191175 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode36e2fb7_b344_4c81_9922_3d9bc9526261.slice/crio-0670214d07774625742978d5757e7df524353847724977d874398603fe3a2785 WatchSource:0}: Error finding container 0670214d07774625742978d5757e7df524353847724977d874398603fe3a2785: Status 404 returned error can't find the container with id 0670214d07774625742978d5757e7df524353847724977d874398603fe3a2785 Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.193406 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.251853 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e36e2fb7-b344-4c81-9922-3d9bc9526261","Type":"ContainerStarted","Data":"0670214d07774625742978d5757e7df524353847724977d874398603fe3a2785"} Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.254508 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f76787fd-cvxz9" event={"ID":"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1","Type":"ContainerStarted","Data":"2f5afb45ff4429d4cda0f8c454a2181b9b4499097fd94a8ef9344f0def4ed154"} Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.254552 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f76787fd-cvxz9" event={"ID":"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1","Type":"ContainerStarted","Data":"2501a03015d5625347285e17b0d957182d57a4f5dc7b4ebf51ffcc500db0ae77"} Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.254562 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f76787fd-cvxz9" 
event={"ID":"f550d045-d552-4ea9-b5c8-a4e7d9ff29a1","Type":"ContainerStarted","Data":"5d2f7fdd85819b2d18ba5d8e38ba3438cbd4be0880f5e847756e032587b0f7bb"} Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.256096 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f76787fd-cvxz9" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.256190 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f76787fd-cvxz9" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.319439 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6f76787fd-cvxz9" podStartSLOduration=2.319421551 podStartE2EDuration="2.319421551s" podCreationTimestamp="2026-03-17 11:33:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:33:21.30789824 +0000 UTC m=+1304.434026008" watchObservedRunningTime="2026-03-17 11:33:21.319421551 +0000 UTC m=+1304.445549299" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.346147 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc51fde3-3dea-4f98-ac76-18d3a410fab7","Type":"ContainerStarted","Data":"f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0"} Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.346483 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc51fde3-3dea-4f98-ac76-18d3a410fab7","Type":"ContainerStarted","Data":"a3a216842471606a7183e4dc8478d80fa97b1385eb23f7628e685f2bf9b98497"} Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.416220 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.643360 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-774cd45c89-tc5lr"] Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.643865 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-774cd45c89-tc5lr" podUID="ca9f66f5-5921-4f35-a45a-0de69f1a3434" containerName="neutron-httpd" containerID="cri-o://50df2e1afbc3ba5e5be7c6cf914d4d34d955758f6abc13b7f49af5e247bff50c" gracePeriod=30 Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.644040 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-774cd45c89-tc5lr" podUID="ca9f66f5-5921-4f35-a45a-0de69f1a3434" containerName="neutron-api" containerID="cri-o://9cbb8d608c9da7785459131e7c950b27ff8bb5a50ff50434ae31b524c78ba95d" gracePeriod=30 Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.675259 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-774cd45c89-tc5lr" podUID="ca9f66f5-5921-4f35-a45a-0de69f1a3434" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.164:9696/\": EOF" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.691069 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c7d48c699-86xxh"] Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.693402 4742 util.go:30] "No sandbox for pod can be found. 
Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.694545 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c7d48c699-86xxh"] Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.819156 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-internal-tls-certs\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.819344 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-public-tls-certs\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.819367 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-config\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.819473 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-combined-ca-bundle\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.819541 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-httpd-config\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.819700 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5z5f\" (UniqueName: \"kubernetes.io/projected/1ccfa960-12b9-4537-b822-89da493f780c-kube-api-access-z5z5f\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.819822 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-ovndb-tls-certs\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.921524 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-public-tls-certs\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.921569 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\"
(UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-config\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.921608 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-combined-ca-bundle\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.921662 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-httpd-config\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.921690 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5z5f\" (UniqueName: \"kubernetes.io/projected/1ccfa960-12b9-4537-b822-89da493f780c-kube-api-access-z5z5f\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.921725 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-ovndb-tls-certs\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.921779 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-internal-tls-certs\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.926218 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-internal-tls-certs\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.926564 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-config\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.926949 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-ovndb-tls-certs\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.927939 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-public-tls-certs\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " 
pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.929099 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-combined-ca-bundle\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.941576 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1ccfa960-12b9-4537-b822-89da493f780c-httpd-config\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:21 crc kubenswrapper[4742]: I0317 11:33:21.949021 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5z5f\" (UniqueName: \"kubernetes.io/projected/1ccfa960-12b9-4537-b822-89da493f780c-kube-api-access-z5z5f\") pod \"neutron-c7d48c699-86xxh\" (UID: \"1ccfa960-12b9-4537-b822-89da493f780c\") " pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.012122 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.367633 4742 generic.go:334] "Generic (PLEG): container finished" podID="ca9f66f5-5921-4f35-a45a-0de69f1a3434" containerID="50df2e1afbc3ba5e5be7c6cf914d4d34d955758f6abc13b7f49af5e247bff50c" exitCode=0 Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.368177 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-774cd45c89-tc5lr" event={"ID":"ca9f66f5-5921-4f35-a45a-0de69f1a3434","Type":"ContainerDied","Data":"50df2e1afbc3ba5e5be7c6cf914d4d34d955758f6abc13b7f49af5e247bff50c"} Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.396231 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc51fde3-3dea-4f98-ac76-18d3a410fab7","Type":"ContainerStarted","Data":"bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb"} Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.402600 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e36e2fb7-b344-4c81-9922-3d9bc9526261","Type":"ContainerStarted","Data":"ae1ede92c9930f3cab911ea9a815eb4012009f4f20eca1bc29b27f84c99638ad"} Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.408293 4742 generic.go:334] "Generic (PLEG): container finished" podID="4eb7c4c3-2dad-464d-8e2c-09e618d140e4" containerID="08a901b0036e08829aab4952f6da9de41d1d594fd0633eb6a7725734021ddeb4" exitCode=137 Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.408336 4742 generic.go:334] "Generic (PLEG): container finished" podID="4eb7c4c3-2dad-464d-8e2c-09e618d140e4" containerID="52611f020d25ddf92efdb5a2f6d79051f256d01eebacede571cf7a38a89a529b" exitCode=137 Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.408369 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-647cff84fc-lltcg" event={"ID":"4eb7c4c3-2dad-464d-8e2c-09e618d140e4","Type":"ContainerDied","Data":"08a901b0036e08829aab4952f6da9de41d1d594fd0633eb6a7725734021ddeb4"} Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.408436 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-647cff84fc-lltcg" 
event={"ID":"4eb7c4c3-2dad-464d-8e2c-09e618d140e4","Type":"ContainerDied","Data":"52611f020d25ddf92efdb5a2f6d79051f256d01eebacede571cf7a38a89a529b"} Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.601056 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c7d48c699-86xxh"] Mar 17 11:33:22 crc kubenswrapper[4742]: W0317 11:33:22.605690 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ccfa960_12b9_4537_b822_89da493f780c.slice/crio-4469aa59a39989a8b00bd29d7a9d7ba0d975cd6081a5df5e397813d2ffc06286 WatchSource:0}: Error finding container 4469aa59a39989a8b00bd29d7a9d7ba0d975cd6081a5df5e397813d2ffc06286: Status 404 returned error can't find the container with id 4469aa59a39989a8b00bd29d7a9d7ba0d975cd6081a5df5e397813d2ffc06286 Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.610973 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.738298 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-config-data\") pod \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.738423 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-horizon-secret-key\") pod \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.738486 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd9j2\" (UniqueName: \"kubernetes.io/projected/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-kube-api-access-wd9j2\") pod \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.738565 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-scripts\") pod \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.738599 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-logs\") pod \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\" (UID: \"4eb7c4c3-2dad-464d-8e2c-09e618d140e4\") " Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.740606 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-logs" (OuterVolumeSpecName: "logs") pod "4eb7c4c3-2dad-464d-8e2c-09e618d140e4" (UID: "4eb7c4c3-2dad-464d-8e2c-09e618d140e4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.744353 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4eb7c4c3-2dad-464d-8e2c-09e618d140e4" (UID: "4eb7c4c3-2dad-464d-8e2c-09e618d140e4"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.757078 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-kube-api-access-wd9j2" (OuterVolumeSpecName: "kube-api-access-wd9j2") pod "4eb7c4c3-2dad-464d-8e2c-09e618d140e4" (UID: "4eb7c4c3-2dad-464d-8e2c-09e618d140e4"). InnerVolumeSpecName "kube-api-access-wd9j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.766158 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-config-data" (OuterVolumeSpecName: "config-data") pod "4eb7c4c3-2dad-464d-8e2c-09e618d140e4" (UID: "4eb7c4c3-2dad-464d-8e2c-09e618d140e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.776767 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-scripts" (OuterVolumeSpecName: "scripts") pod "4eb7c4c3-2dad-464d-8e2c-09e618d140e4" (UID: "4eb7c4c3-2dad-464d-8e2c-09e618d140e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.840228 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd9j2\" (UniqueName: \"kubernetes.io/projected/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-kube-api-access-wd9j2\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.840260 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.840269 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-logs\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.840278 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:22 crc kubenswrapper[4742]: I0317 11:33:22.840288 4742 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4eb7c4c3-2dad-464d-8e2c-09e618d140e4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:23 crc kubenswrapper[4742]: I0317 11:33:23.418868 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc51fde3-3dea-4f98-ac76-18d3a410fab7","Type":"ContainerStarted","Data":"d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa"} Mar 17 11:33:23 crc kubenswrapper[4742]: I0317 11:33:23.420477 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e36e2fb7-b344-4c81-9922-3d9bc9526261","Type":"ContainerStarted","Data":"b2ef0b78b6d5fb462bf402ba1ecaf7b23dd21f972422344b060feff8d4bf6d8d"} Mar 17 11:33:23 crc kubenswrapper[4742]: I0317 11:33:23.421013 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 17 11:33:23 crc kubenswrapper[4742]: I0317 11:33:23.426483 
4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c7d48c699-86xxh" event={"ID":"1ccfa960-12b9-4537-b822-89da493f780c","Type":"ContainerStarted","Data":"e785b3a27bc154c814373621128b765e644b87198ab24e63d3d18713f34793fc"} Mar 17 11:33:23 crc kubenswrapper[4742]: I0317 11:33:23.426516 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c7d48c699-86xxh" event={"ID":"1ccfa960-12b9-4537-b822-89da493f780c","Type":"ContainerStarted","Data":"8478c12bd553e89d414c108dcaabd30406341652c619441f0be336b224497d36"} Mar 17 11:33:23 crc kubenswrapper[4742]: I0317 11:33:23.426527 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c7d48c699-86xxh" event={"ID":"1ccfa960-12b9-4537-b822-89da493f780c","Type":"ContainerStarted","Data":"4469aa59a39989a8b00bd29d7a9d7ba0d975cd6081a5df5e397813d2ffc06286"} Mar 17 11:33:23 crc kubenswrapper[4742]: I0317 11:33:23.427149 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:23 crc kubenswrapper[4742]: I0317 11:33:23.432744 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-647cff84fc-lltcg" Mar 17 11:33:23 crc kubenswrapper[4742]: I0317 11:33:23.434205 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-647cff84fc-lltcg" event={"ID":"4eb7c4c3-2dad-464d-8e2c-09e618d140e4","Type":"ContainerDied","Data":"a5089c9fb0b8d6a07ddef3db40f657c3d1eb01707c6f4bd95b306cc377b1207f"} Mar 17 11:33:23 crc kubenswrapper[4742]: I0317 11:33:23.434278 4742 scope.go:117] "RemoveContainer" containerID="08a901b0036e08829aab4952f6da9de41d1d594fd0633eb6a7725734021ddeb4" Mar 17 11:33:23 crc kubenswrapper[4742]: I0317 11:33:23.470816 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.470795657 podStartE2EDuration="3.470795657s" podCreationTimestamp="2026-03-17 11:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:33:23.456044445 +0000 UTC m=+1306.582172213" watchObservedRunningTime="2026-03-17 11:33:23.470795657 +0000 UTC m=+1306.596923415" Mar 17 11:33:23 crc kubenswrapper[4742]: I0317 11:33:23.524217 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-c7d48c699-86xxh" podStartSLOduration=2.524198134 podStartE2EDuration="2.524198134s" podCreationTimestamp="2026-03-17 11:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:33:23.48815873 +0000 UTC m=+1306.614286488" watchObservedRunningTime="2026-03-17 11:33:23.524198134 +0000 UTC m=+1306.650325892" Mar 17 11:33:23 crc kubenswrapper[4742]: I0317 11:33:23.549821 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-647cff84fc-lltcg"] Mar 17 11:33:23 crc kubenswrapper[4742]: I0317 11:33:23.565576 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-647cff84fc-lltcg"] Mar 17 11:33:23 crc kubenswrapper[4742]: I0317 11:33:23.577354 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-774cd45c89-tc5lr" podUID="ca9f66f5-5921-4f35-a45a-0de69f1a3434" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.164:9696/\": dial tcp 10.217.0.164:9696: connect: connection refused" Mar 17 11:33:23 crc kubenswrapper[4742]: I0317 11:33:23.665872 4742 scope.go:117] "RemoveContainer" containerID="52611f020d25ddf92efdb5a2f6d79051f256d01eebacede571cf7a38a89a529b"
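
In the two pod_startup_latency_tracker entries above, podStartSLOduration is watchObservedRunningTime minus podCreationTimestamp, less any image-pull window (the pull timestamps here are the zero time because no pull was needed; the ceilometer-0 entry a few lines below shows a roughly 4-second pull being subtracted). The cinder-api-0 arithmetic checks out in Go; the timestamps are copied verbatim from the entry above, and time.Parse accepts the fractional seconds even though the layout omits them:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05 -0700 MST"
        created, err := time.Parse(layout, "2026-03-17 11:33:20 +0000 UTC")
        if err != nil {
            panic(err)
        }
        observed, err := time.Parse(layout, "2026-03-17 11:33:23.470795657 +0000 UTC")
        if err != nil {
            panic(err)
        }
        fmt.Println(observed.Sub(created)) // 3.470795657s, the logged podStartSLOduration
    }
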
Mar 17 11:33:24 crc kubenswrapper[4742]: I0317 11:33:24.287552 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:33:24 crc kubenswrapper[4742]: I0317 11:33:24.347191 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:33:24 crc kubenswrapper[4742]: I0317 11:33:24.673170 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eb7c4c3-2dad-464d-8e2c-09e618d140e4" path="/var/lib/kubelet/pods/4eb7c4c3-2dad-464d-8e2c-09e618d140e4/volumes" Mar 17 11:33:24 crc kubenswrapper[4742]: I0317 11:33:24.928349 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:25 crc kubenswrapper[4742]: I0317 11:33:25.056361 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:25 crc kubenswrapper[4742]: I0317 11:33:25.132646 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 17 11:33:25 crc kubenswrapper[4742]: I0317 11:33:25.371316 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 17 11:33:25 crc kubenswrapper[4742]: I0317 11:33:25.409094 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:33:25 crc kubenswrapper[4742]: I0317 11:33:25.465655 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-pxjc2"] Mar 17 11:33:25 crc kubenswrapper[4742]: I0317 11:33:25.465968 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" podUID="6275127e-0ae2-4d23-8592-ba85c3a7661b" containerName="dnsmasq-dns" containerID="cri-o://4f71c848c99a2374408a16b27e0a2bc9433f250c6dd3ea01e5171e0d2d8cb765" gracePeriod=10 Mar 17 11:33:25 crc kubenswrapper[4742]: I0317 11:33:25.490220 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc51fde3-3dea-4f98-ac76-18d3a410fab7","Type":"ContainerStarted","Data":"3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585"} Mar 17 11:33:25 crc kubenswrapper[4742]: I0317 11:33:25.491060 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 11:33:25 crc kubenswrapper[4742]: I0317 11:33:25.534531 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.4992022990000002 podStartE2EDuration="6.53450998s" podCreationTimestamp="2026-03-17 11:33:19 +0000 UTC" firstStartedPulling="2026-03-17 11:33:20.273479447 +0000 UTC m=+1303.399607205" lastFinishedPulling="2026-03-17 11:33:24.308787128 +0000 UTC m=+1307.434914886" observedRunningTime="2026-03-17 11:33:25.523587565 +0000 UTC m=+1308.649715323" watchObservedRunningTime="2026-03-17 11:33:25.53450998 +0000 UTC m=+1308.660637738" Mar 17 11:33:25 crc kubenswrapper[4742]: I0317 11:33:25.584292 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.173468 4742 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.242422 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-ovsdbserver-sb\") pod \"6275127e-0ae2-4d23-8592-ba85c3a7661b\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.242619 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-dns-svc\") pod \"6275127e-0ae2-4d23-8592-ba85c3a7661b\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.242643 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-ovsdbserver-nb\") pod \"6275127e-0ae2-4d23-8592-ba85c3a7661b\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.242674 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-config\") pod \"6275127e-0ae2-4d23-8592-ba85c3a7661b\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.242697 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6f8m\" (UniqueName: \"kubernetes.io/projected/6275127e-0ae2-4d23-8592-ba85c3a7661b-kube-api-access-s6f8m\") pod \"6275127e-0ae2-4d23-8592-ba85c3a7661b\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.242751 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-dns-swift-storage-0\") pod \"6275127e-0ae2-4d23-8592-ba85c3a7661b\" (UID: \"6275127e-0ae2-4d23-8592-ba85c3a7661b\") " Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.266146 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6275127e-0ae2-4d23-8592-ba85c3a7661b-kube-api-access-s6f8m" (OuterVolumeSpecName: "kube-api-access-s6f8m") pod "6275127e-0ae2-4d23-8592-ba85c3a7661b" (UID: "6275127e-0ae2-4d23-8592-ba85c3a7661b"). InnerVolumeSpecName "kube-api-access-s6f8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.340404 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-config" (OuterVolumeSpecName: "config") pod "6275127e-0ae2-4d23-8592-ba85c3a7661b" (UID: "6275127e-0ae2-4d23-8592-ba85c3a7661b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.345484 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.345519 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6f8m\" (UniqueName: \"kubernetes.io/projected/6275127e-0ae2-4d23-8592-ba85c3a7661b-kube-api-access-s6f8m\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.372294 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6275127e-0ae2-4d23-8592-ba85c3a7661b" (UID: "6275127e-0ae2-4d23-8592-ba85c3a7661b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.374636 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6275127e-0ae2-4d23-8592-ba85c3a7661b" (UID: "6275127e-0ae2-4d23-8592-ba85c3a7661b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.377735 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6275127e-0ae2-4d23-8592-ba85c3a7661b" (UID: "6275127e-0ae2-4d23-8592-ba85c3a7661b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.379631 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6275127e-0ae2-4d23-8592-ba85c3a7661b" (UID: "6275127e-0ae2-4d23-8592-ba85c3a7661b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.387958 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.447514 4742 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.447544 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.447554 4742 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.447563 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6275127e-0ae2-4d23-8592-ba85c3a7661b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.500857 4742 generic.go:334] "Generic (PLEG): container finished" podID="6275127e-0ae2-4d23-8592-ba85c3a7661b" containerID="4f71c848c99a2374408a16b27e0a2bc9433f250c6dd3ea01e5171e0d2d8cb765" exitCode=0 Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.500954 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.501048 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" event={"ID":"6275127e-0ae2-4d23-8592-ba85c3a7661b","Type":"ContainerDied","Data":"4f71c848c99a2374408a16b27e0a2bc9433f250c6dd3ea01e5171e0d2d8cb765"} Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.501119 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-pxjc2" event={"ID":"6275127e-0ae2-4d23-8592-ba85c3a7661b","Type":"ContainerDied","Data":"14973ac7c518e0e6a78e0274abc70f735cdf1541353e7b74f225d9b2a8d35f16"} Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.501154 4742 scope.go:117] "RemoveContainer" containerID="4f71c848c99a2374408a16b27e0a2bc9433f250c6dd3ea01e5171e0d2d8cb765" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.501435 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="86924589-f9f6-432e-aa21-0361bbe86f06" containerName="cinder-scheduler" containerID="cri-o://2cd403c6657b46f609799733644e3b131de0e1ebcbc9251385f29c6bb26d3dc5" gracePeriod=30 Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.501624 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="86924589-f9f6-432e-aa21-0361bbe86f06" containerName="probe" containerID="cri-o://e4ffd56f3aa6ed866dfeb45b882d14c6fe20822296aa78b5b7b92c44a0fb9b5c" gracePeriod=30 Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.545016 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-pxjc2"] Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.545055 4742 scope.go:117] "RemoveContainer" 
containerID="d6e60441db6bd455f12c0cd16ef3b9173ae95b2c9f39155088fdee419abb3bfe" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.553214 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-pxjc2"] Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.571201 4742 scope.go:117] "RemoveContainer" containerID="4f71c848c99a2374408a16b27e0a2bc9433f250c6dd3ea01e5171e0d2d8cb765" Mar 17 11:33:26 crc kubenswrapper[4742]: E0317 11:33:26.571653 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f71c848c99a2374408a16b27e0a2bc9433f250c6dd3ea01e5171e0d2d8cb765\": container with ID starting with 4f71c848c99a2374408a16b27e0a2bc9433f250c6dd3ea01e5171e0d2d8cb765 not found: ID does not exist" containerID="4f71c848c99a2374408a16b27e0a2bc9433f250c6dd3ea01e5171e0d2d8cb765" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.571702 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f71c848c99a2374408a16b27e0a2bc9433f250c6dd3ea01e5171e0d2d8cb765"} err="failed to get container status \"4f71c848c99a2374408a16b27e0a2bc9433f250c6dd3ea01e5171e0d2d8cb765\": rpc error: code = NotFound desc = could not find container \"4f71c848c99a2374408a16b27e0a2bc9433f250c6dd3ea01e5171e0d2d8cb765\": container with ID starting with 4f71c848c99a2374408a16b27e0a2bc9433f250c6dd3ea01e5171e0d2d8cb765 not found: ID does not exist" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.571729 4742 scope.go:117] "RemoveContainer" containerID="d6e60441db6bd455f12c0cd16ef3b9173ae95b2c9f39155088fdee419abb3bfe" Mar 17 11:33:26 crc kubenswrapper[4742]: E0317 11:33:26.572183 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6e60441db6bd455f12c0cd16ef3b9173ae95b2c9f39155088fdee419abb3bfe\": container with ID starting with d6e60441db6bd455f12c0cd16ef3b9173ae95b2c9f39155088fdee419abb3bfe not found: ID does not exist" containerID="d6e60441db6bd455f12c0cd16ef3b9173ae95b2c9f39155088fdee419abb3bfe" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.572729 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e60441db6bd455f12c0cd16ef3b9173ae95b2c9f39155088fdee419abb3bfe"} err="failed to get container status \"d6e60441db6bd455f12c0cd16ef3b9173ae95b2c9f39155088fdee419abb3bfe\": rpc error: code = NotFound desc = could not find container \"d6e60441db6bd455f12c0cd16ef3b9173ae95b2c9f39155088fdee419abb3bfe\": container with ID starting with d6e60441db6bd455f12c0cd16ef3b9173ae95b2c9f39155088fdee419abb3bfe not found: ID does not exist" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.675331 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6275127e-0ae2-4d23-8592-ba85c3a7661b" path="/var/lib/kubelet/pods/6275127e-0ae2-4d23-8592-ba85c3a7661b/volumes" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.732235 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5c4556b444-kq454" Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.811542 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cbc75d594-mvhf5"] Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.811741 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cbc75d594-mvhf5" podUID="f2bbef92-cd02-42d8-b81d-ab7248e29328" 
containerName="horizon-log" containerID="cri-o://ae60630080e9cc570d98471ac0c31f7d1dbbc55f4cd0bf7020ef89cc50cc5e24" gracePeriod=30 Mar 17 11:33:26 crc kubenswrapper[4742]: I0317 11:33:26.812346 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cbc75d594-mvhf5" podUID="f2bbef92-cd02-42d8-b81d-ab7248e29328" containerName="horizon" containerID="cri-o://bdba66d58b80df369c99a5b4d40c5a7e87b7ad31620cffac6b75800835ccef63" gracePeriod=30 Mar 17 11:33:27 crc kubenswrapper[4742]: I0317 11:33:27.526026 4742 generic.go:334] "Generic (PLEG): container finished" podID="86924589-f9f6-432e-aa21-0361bbe86f06" containerID="e4ffd56f3aa6ed866dfeb45b882d14c6fe20822296aa78b5b7b92c44a0fb9b5c" exitCode=0 Mar 17 11:33:27 crc kubenswrapper[4742]: I0317 11:33:27.526321 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86924589-f9f6-432e-aa21-0361bbe86f06","Type":"ContainerDied","Data":"e4ffd56f3aa6ed866dfeb45b882d14c6fe20822296aa78b5b7b92c44a0fb9b5c"} Mar 17 11:33:27 crc kubenswrapper[4742]: I0317 11:33:27.727480 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f76787fd-cvxz9" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.006369 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5485d7d4fb-62qtm" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.341349 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5485d7d4fb-62qtm" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.639279 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6976ff4586-bgqjp"] Mar 17 11:33:28 crc kubenswrapper[4742]: E0317 11:33:28.640140 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb7c4c3-2dad-464d-8e2c-09e618d140e4" containerName="horizon-log" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.640160 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb7c4c3-2dad-464d-8e2c-09e618d140e4" containerName="horizon-log" Mar 17 11:33:28 crc kubenswrapper[4742]: E0317 11:33:28.640192 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb7c4c3-2dad-464d-8e2c-09e618d140e4" containerName="horizon" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.640205 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb7c4c3-2dad-464d-8e2c-09e618d140e4" containerName="horizon" Mar 17 11:33:28 crc kubenswrapper[4742]: E0317 11:33:28.640223 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6275127e-0ae2-4d23-8592-ba85c3a7661b" containerName="dnsmasq-dns" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.640236 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="6275127e-0ae2-4d23-8592-ba85c3a7661b" containerName="dnsmasq-dns" Mar 17 11:33:28 crc kubenswrapper[4742]: E0317 11:33:28.640273 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6275127e-0ae2-4d23-8592-ba85c3a7661b" containerName="init" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.640285 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="6275127e-0ae2-4d23-8592-ba85c3a7661b" containerName="init" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.640568 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb7c4c3-2dad-464d-8e2c-09e618d140e4" containerName="horizon" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.640599 4742 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb7c4c3-2dad-464d-8e2c-09e618d140e4" containerName="horizon-log" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.640635 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="6275127e-0ae2-4d23-8592-ba85c3a7661b" containerName="dnsmasq-dns" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.642176 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.654252 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6976ff4586-bgqjp"] Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.695436 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221187ef-dec0-47dd-894e-ff9f2d1daa09-config-data\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.695469 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/221187ef-dec0-47dd-894e-ff9f2d1daa09-logs\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.695509 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/221187ef-dec0-47dd-894e-ff9f2d1daa09-combined-ca-bundle\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.695616 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/221187ef-dec0-47dd-894e-ff9f2d1daa09-internal-tls-certs\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.695645 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/221187ef-dec0-47dd-894e-ff9f2d1daa09-scripts\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.695688 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w4dg\" (UniqueName: \"kubernetes.io/projected/221187ef-dec0-47dd-894e-ff9f2d1daa09-kube-api-access-4w4dg\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.695731 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/221187ef-dec0-47dd-894e-ff9f2d1daa09-public-tls-certs\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.797228 4742 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/221187ef-dec0-47dd-894e-ff9f2d1daa09-internal-tls-certs\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.797291 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/221187ef-dec0-47dd-894e-ff9f2d1daa09-scripts\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.797350 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w4dg\" (UniqueName: \"kubernetes.io/projected/221187ef-dec0-47dd-894e-ff9f2d1daa09-kube-api-access-4w4dg\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.797398 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/221187ef-dec0-47dd-894e-ff9f2d1daa09-public-tls-certs\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.797449 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221187ef-dec0-47dd-894e-ff9f2d1daa09-config-data\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.797464 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/221187ef-dec0-47dd-894e-ff9f2d1daa09-logs\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.797494 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/221187ef-dec0-47dd-894e-ff9f2d1daa09-combined-ca-bundle\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.799106 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/221187ef-dec0-47dd-894e-ff9f2d1daa09-logs\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.803706 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/221187ef-dec0-47dd-894e-ff9f2d1daa09-combined-ca-bundle\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.804322 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/221187ef-dec0-47dd-894e-ff9f2d1daa09-scripts\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.808224 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/221187ef-dec0-47dd-894e-ff9f2d1daa09-internal-tls-certs\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.808262 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/221187ef-dec0-47dd-894e-ff9f2d1daa09-public-tls-certs\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.808686 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221187ef-dec0-47dd-894e-ff9f2d1daa09-config-data\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.840404 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w4dg\" (UniqueName: \"kubernetes.io/projected/221187ef-dec0-47dd-894e-ff9f2d1daa09-kube-api-access-4w4dg\") pod \"placement-6976ff4586-bgqjp\" (UID: \"221187ef-dec0-47dd-894e-ff9f2d1daa09\") " pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:28 crc kubenswrapper[4742]: I0317 11:33:28.958972 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:29 crc kubenswrapper[4742]: I0317 11:33:29.193861 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f76787fd-cvxz9" Mar 17 11:33:29 crc kubenswrapper[4742]: I0317 11:33:29.252293 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-76c59bbd5d-b7mv4"] Mar 17 11:33:29 crc kubenswrapper[4742]: I0317 11:33:29.252515 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-76c59bbd5d-b7mv4" podUID="38dc6520-ca66-44cf-bd2d-6d65bf57ff3a" containerName="barbican-api-log" containerID="cri-o://b3f970fd22311f013fc289b3df07b5f78e7a4f99a5ffb24d66df51f98fa31fd2" gracePeriod=30 Mar 17 11:33:29 crc kubenswrapper[4742]: I0317 11:33:29.253087 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-76c59bbd5d-b7mv4" podUID="38dc6520-ca66-44cf-bd2d-6d65bf57ff3a" containerName="barbican-api" containerID="cri-o://f98f9505566dbe4a3a4f85287f648f6e3592ea6be5687ab836fe65a33e535a24" gracePeriod=30 Mar 17 11:33:29 crc kubenswrapper[4742]: I0317 11:33:29.462733 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6976ff4586-bgqjp"] Mar 17 11:33:29 crc kubenswrapper[4742]: W0317 11:33:29.466157 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod221187ef_dec0_47dd_894e_ff9f2d1daa09.slice/crio-1ec4b27eac0031b851fa7b73ea370f68039ab244ade7e380845d2bbfe3afe81f WatchSource:0}: Error finding container 1ec4b27eac0031b851fa7b73ea370f68039ab244ade7e380845d2bbfe3afe81f: Status 404 returned error can't find the container with id 1ec4b27eac0031b851fa7b73ea370f68039ab244ade7e380845d2bbfe3afe81f Mar 17 11:33:29 crc kubenswrapper[4742]: I0317 11:33:29.589969 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6976ff4586-bgqjp" event={"ID":"221187ef-dec0-47dd-894e-ff9f2d1daa09","Type":"ContainerStarted","Data":"1ec4b27eac0031b851fa7b73ea370f68039ab244ade7e380845d2bbfe3afe81f"} Mar 17 11:33:29 crc kubenswrapper[4742]: I0317 11:33:29.592231 4742 generic.go:334] "Generic (PLEG): container finished" podID="38dc6520-ca66-44cf-bd2d-6d65bf57ff3a" containerID="b3f970fd22311f013fc289b3df07b5f78e7a4f99a5ffb24d66df51f98fa31fd2" exitCode=143 Mar 17 11:33:29 crc kubenswrapper[4742]: I0317 11:33:29.592270 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c59bbd5d-b7mv4" event={"ID":"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a","Type":"ContainerDied","Data":"b3f970fd22311f013fc289b3df07b5f78e7a4f99a5ffb24d66df51f98fa31fd2"} Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.246096 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.326736 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-config-data\") pod \"86924589-f9f6-432e-aa21-0361bbe86f06\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.327323 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86924589-f9f6-432e-aa21-0361bbe86f06-etc-machine-id\") pod \"86924589-f9f6-432e-aa21-0361bbe86f06\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.327405 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-scripts\") pod \"86924589-f9f6-432e-aa21-0361bbe86f06\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.327430 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-combined-ca-bundle\") pod \"86924589-f9f6-432e-aa21-0361bbe86f06\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.327453 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-config-data-custom\") pod \"86924589-f9f6-432e-aa21-0361bbe86f06\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.327528 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgtjm\" (UniqueName: \"kubernetes.io/projected/86924589-f9f6-432e-aa21-0361bbe86f06-kube-api-access-cgtjm\") pod \"86924589-f9f6-432e-aa21-0361bbe86f06\" (UID: \"86924589-f9f6-432e-aa21-0361bbe86f06\") " Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.329974 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86924589-f9f6-432e-aa21-0361bbe86f06-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "86924589-f9f6-432e-aa21-0361bbe86f06" (UID: "86924589-f9f6-432e-aa21-0361bbe86f06"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.335586 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-scripts" (OuterVolumeSpecName: "scripts") pod "86924589-f9f6-432e-aa21-0361bbe86f06" (UID: "86924589-f9f6-432e-aa21-0361bbe86f06"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.337104 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86924589-f9f6-432e-aa21-0361bbe86f06-kube-api-access-cgtjm" (OuterVolumeSpecName: "kube-api-access-cgtjm") pod "86924589-f9f6-432e-aa21-0361bbe86f06" (UID: "86924589-f9f6-432e-aa21-0361bbe86f06"). InnerVolumeSpecName "kube-api-access-cgtjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.346057 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "86924589-f9f6-432e-aa21-0361bbe86f06" (UID: "86924589-f9f6-432e-aa21-0361bbe86f06"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.382655 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86924589-f9f6-432e-aa21-0361bbe86f06" (UID: "86924589-f9f6-432e-aa21-0361bbe86f06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.429130 4742 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/86924589-f9f6-432e-aa21-0361bbe86f06-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.429160 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.429169 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.429179 4742 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.429187 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgtjm\" (UniqueName: \"kubernetes.io/projected/86924589-f9f6-432e-aa21-0361bbe86f06-kube-api-access-cgtjm\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.464231 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-config-data" (OuterVolumeSpecName: "config-data") pod "86924589-f9f6-432e-aa21-0361bbe86f06" (UID: "86924589-f9f6-432e-aa21-0361bbe86f06"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.530615 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86924589-f9f6-432e-aa21-0361bbe86f06-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.602220 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6976ff4586-bgqjp" event={"ID":"221187ef-dec0-47dd-894e-ff9f2d1daa09","Type":"ContainerStarted","Data":"e4149d43e1072709b58d9dd7a630a5f585ff3c419edb0525bd66b67cd986a470"} Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.602263 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6976ff4586-bgqjp" event={"ID":"221187ef-dec0-47dd-894e-ff9f2d1daa09","Type":"ContainerStarted","Data":"e74bc976bbb0d8ac1d575947e53dfcb102f7809c010199b333679fa0a3fbbe93"} Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.603705 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.603731 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.606730 4742 generic.go:334] "Generic (PLEG): container finished" podID="f2bbef92-cd02-42d8-b81d-ab7248e29328" containerID="bdba66d58b80df369c99a5b4d40c5a7e87b7ad31620cffac6b75800835ccef63" exitCode=0 Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.606772 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cbc75d594-mvhf5" event={"ID":"f2bbef92-cd02-42d8-b81d-ab7248e29328","Type":"ContainerDied","Data":"bdba66d58b80df369c99a5b4d40c5a7e87b7ad31620cffac6b75800835ccef63"} Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.626469 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6976ff4586-bgqjp" podStartSLOduration=2.626449801 podStartE2EDuration="2.626449801s" podCreationTimestamp="2026-03-17 11:33:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:33:30.624861787 +0000 UTC m=+1313.750989545" watchObservedRunningTime="2026-03-17 11:33:30.626449801 +0000 UTC m=+1313.752577559" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.626583 4742 generic.go:334] "Generic (PLEG): container finished" podID="86924589-f9f6-432e-aa21-0361bbe86f06" containerID="2cd403c6657b46f609799733644e3b131de0e1ebcbc9251385f29c6bb26d3dc5" exitCode=0 Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.626624 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86924589-f9f6-432e-aa21-0361bbe86f06","Type":"ContainerDied","Data":"2cd403c6657b46f609799733644e3b131de0e1ebcbc9251385f29c6bb26d3dc5"} Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.626646 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.626660 4742 scope.go:117] "RemoveContainer" containerID="e4ffd56f3aa6ed866dfeb45b882d14c6fe20822296aa78b5b7b92c44a0fb9b5c" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.626648 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"86924589-f9f6-432e-aa21-0361bbe86f06","Type":"ContainerDied","Data":"0ec897e190c6e6a3522e16c7367cb3e55a327c19a86f7fc40021001955737c14"} Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.663521 4742 scope.go:117] "RemoveContainer" containerID="2cd403c6657b46f609799733644e3b131de0e1ebcbc9251385f29c6bb26d3dc5" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.674388 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.690970 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.694216 4742 scope.go:117] "RemoveContainer" containerID="e4ffd56f3aa6ed866dfeb45b882d14c6fe20822296aa78b5b7b92c44a0fb9b5c" Mar 17 11:33:30 crc kubenswrapper[4742]: E0317 11:33:30.715023 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4ffd56f3aa6ed866dfeb45b882d14c6fe20822296aa78b5b7b92c44a0fb9b5c\": container with ID starting with e4ffd56f3aa6ed866dfeb45b882d14c6fe20822296aa78b5b7b92c44a0fb9b5c not found: ID does not exist" containerID="e4ffd56f3aa6ed866dfeb45b882d14c6fe20822296aa78b5b7b92c44a0fb9b5c" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.715070 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4ffd56f3aa6ed866dfeb45b882d14c6fe20822296aa78b5b7b92c44a0fb9b5c"} err="failed to get container status \"e4ffd56f3aa6ed866dfeb45b882d14c6fe20822296aa78b5b7b92c44a0fb9b5c\": rpc error: code = NotFound desc = could not find container \"e4ffd56f3aa6ed866dfeb45b882d14c6fe20822296aa78b5b7b92c44a0fb9b5c\": container with ID starting with e4ffd56f3aa6ed866dfeb45b882d14c6fe20822296aa78b5b7b92c44a0fb9b5c not found: ID does not exist" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.715093 4742 scope.go:117] "RemoveContainer" containerID="2cd403c6657b46f609799733644e3b131de0e1ebcbc9251385f29c6bb26d3dc5" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.715178 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 11:33:30 crc kubenswrapper[4742]: E0317 11:33:30.715505 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86924589-f9f6-432e-aa21-0361bbe86f06" containerName="cinder-scheduler" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.715522 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="86924589-f9f6-432e-aa21-0361bbe86f06" containerName="cinder-scheduler" Mar 17 11:33:30 crc kubenswrapper[4742]: E0317 11:33:30.715534 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86924589-f9f6-432e-aa21-0361bbe86f06" containerName="probe" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.715541 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="86924589-f9f6-432e-aa21-0361bbe86f06" containerName="probe" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.715710 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="86924589-f9f6-432e-aa21-0361bbe86f06" 
containerName="probe" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.715728 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="86924589-f9f6-432e-aa21-0361bbe86f06" containerName="cinder-scheduler" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.716571 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.721198 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 17 11:33:30 crc kubenswrapper[4742]: E0317 11:33:30.729039 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cd403c6657b46f609799733644e3b131de0e1ebcbc9251385f29c6bb26d3dc5\": container with ID starting with 2cd403c6657b46f609799733644e3b131de0e1ebcbc9251385f29c6bb26d3dc5 not found: ID does not exist" containerID="2cd403c6657b46f609799733644e3b131de0e1ebcbc9251385f29c6bb26d3dc5" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.729085 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cd403c6657b46f609799733644e3b131de0e1ebcbc9251385f29c6bb26d3dc5"} err="failed to get container status \"2cd403c6657b46f609799733644e3b131de0e1ebcbc9251385f29c6bb26d3dc5\": rpc error: code = NotFound desc = could not find container \"2cd403c6657b46f609799733644e3b131de0e1ebcbc9251385f29c6bb26d3dc5\": container with ID starting with 2cd403c6657b46f609799733644e3b131de0e1ebcbc9251385f29c6bb26d3dc5 not found: ID does not exist" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.747495 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.859448 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f6380b-463f-4c5b-9c4a-809c874b2ca5-config-data\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.859779 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12f6380b-463f-4c5b-9c4a-809c874b2ca5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.859809 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12f6380b-463f-4c5b-9c4a-809c874b2ca5-scripts\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.859845 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12f6380b-463f-4c5b-9c4a-809c874b2ca5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.859861 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/12f6380b-463f-4c5b-9c4a-809c874b2ca5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.859902 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dz77\" (UniqueName: \"kubernetes.io/projected/12f6380b-463f-4c5b-9c4a-809c874b2ca5-kube-api-access-6dz77\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.961846 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12f6380b-463f-4c5b-9c4a-809c874b2ca5-scripts\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.961927 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12f6380b-463f-4c5b-9c4a-809c874b2ca5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.961944 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f6380b-463f-4c5b-9c4a-809c874b2ca5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.961985 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dz77\" (UniqueName: \"kubernetes.io/projected/12f6380b-463f-4c5b-9c4a-809c874b2ca5-kube-api-access-6dz77\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.962092 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f6380b-463f-4c5b-9c4a-809c874b2ca5-config-data\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.962109 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12f6380b-463f-4c5b-9c4a-809c874b2ca5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.962175 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12f6380b-463f-4c5b-9c4a-809c874b2ca5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.966040 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12f6380b-463f-4c5b-9c4a-809c874b2ca5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc 
kubenswrapper[4742]: I0317 11:33:30.967213 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12f6380b-463f-4c5b-9c4a-809c874b2ca5-scripts\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.968984 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12f6380b-463f-4c5b-9c4a-809c874b2ca5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.973616 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12f6380b-463f-4c5b-9c4a-809c874b2ca5-config-data\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:30 crc kubenswrapper[4742]: I0317 11:33:30.979812 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dz77\" (UniqueName: \"kubernetes.io/projected/12f6380b-463f-4c5b-9c4a-809c874b2ca5-kube-api-access-6dz77\") pod \"cinder-scheduler-0\" (UID: \"12f6380b-463f-4c5b-9c4a-809c874b2ca5\") " pod="openstack/cinder-scheduler-0" Mar 17 11:33:31 crc kubenswrapper[4742]: I0317 11:33:31.052168 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 17 11:33:31 crc kubenswrapper[4742]: I0317 11:33:31.518146 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 17 11:33:31 crc kubenswrapper[4742]: I0317 11:33:31.644520 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"12f6380b-463f-4c5b-9c4a-809c874b2ca5","Type":"ContainerStarted","Data":"65a28d7fab2764efdb5d48059853068605dcbe505c58ee7768f55432922ca2be"} Mar 17 11:33:32 crc kubenswrapper[4742]: I0317 11:33:32.117768 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5cbc75d594-mvhf5" podUID="f2bbef92-cd02-42d8-b81d-ab7248e29328" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Mar 17 11:33:32 crc kubenswrapper[4742]: I0317 11:33:32.694204 4742 generic.go:334] "Generic (PLEG): container finished" podID="38dc6520-ca66-44cf-bd2d-6d65bf57ff3a" containerID="f98f9505566dbe4a3a4f85287f648f6e3592ea6be5687ab836fe65a33e535a24" exitCode=0 Mar 17 11:33:32 crc kubenswrapper[4742]: I0317 11:33:32.696595 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86924589-f9f6-432e-aa21-0361bbe86f06" path="/var/lib/kubelet/pods/86924589-f9f6-432e-aa21-0361bbe86f06/volumes" Mar 17 11:33:32 crc kubenswrapper[4742]: I0317 11:33:32.697287 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c59bbd5d-b7mv4" event={"ID":"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a","Type":"ContainerDied","Data":"f98f9505566dbe4a3a4f85287f648f6e3592ea6be5687ab836fe65a33e535a24"} Mar 17 11:33:32 crc kubenswrapper[4742]: I0317 11:33:32.697348 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 17 11:33:32 crc kubenswrapper[4742]: I0317 11:33:32.703984 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"12f6380b-463f-4c5b-9c4a-809c874b2ca5","Type":"ContainerStarted","Data":"bd15413c7b19638d2ef670c8a577e07072d0da131580e10b9b7ae238db3dbfe1"} Mar 17 11:33:32 crc kubenswrapper[4742]: I0317 11:33:32.844187 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:32 crc kubenswrapper[4742]: I0317 11:33:32.902492 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-config-data\") pod \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " Mar 17 11:33:32 crc kubenswrapper[4742]: I0317 11:33:32.902556 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-logs\") pod \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " Mar 17 11:33:32 crc kubenswrapper[4742]: I0317 11:33:32.902658 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-combined-ca-bundle\") pod \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " Mar 17 11:33:32 crc kubenswrapper[4742]: I0317 11:33:32.902690 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d42r\" (UniqueName: \"kubernetes.io/projected/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-kube-api-access-5d42r\") pod \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " Mar 17 11:33:32 crc kubenswrapper[4742]: I0317 11:33:32.902723 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-config-data-custom\") pod \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\" (UID: \"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a\") " Mar 17 11:33:32 crc kubenswrapper[4742]: I0317 11:33:32.906321 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-logs" (OuterVolumeSpecName: "logs") pod "38dc6520-ca66-44cf-bd2d-6d65bf57ff3a" (UID: "38dc6520-ca66-44cf-bd2d-6d65bf57ff3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:33:32 crc kubenswrapper[4742]: I0317 11:33:32.928243 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "38dc6520-ca66-44cf-bd2d-6d65bf57ff3a" (UID: "38dc6520-ca66-44cf-bd2d-6d65bf57ff3a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:32 crc kubenswrapper[4742]: I0317 11:33:32.932092 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-kube-api-access-5d42r" (OuterVolumeSpecName: "kube-api-access-5d42r") pod "38dc6520-ca66-44cf-bd2d-6d65bf57ff3a" (UID: "38dc6520-ca66-44cf-bd2d-6d65bf57ff3a"). InnerVolumeSpecName "kube-api-access-5d42r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:32 crc kubenswrapper[4742]: I0317 11:33:32.974036 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38dc6520-ca66-44cf-bd2d-6d65bf57ff3a" (UID: "38dc6520-ca66-44cf-bd2d-6d65bf57ff3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:33 crc kubenswrapper[4742]: I0317 11:33:33.005461 4742 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:33 crc kubenswrapper[4742]: I0317 11:33:33.005489 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-logs\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:33 crc kubenswrapper[4742]: I0317 11:33:33.005500 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:33 crc kubenswrapper[4742]: I0317 11:33:33.005510 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d42r\" (UniqueName: \"kubernetes.io/projected/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-kube-api-access-5d42r\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:33 crc kubenswrapper[4742]: I0317 11:33:33.043701 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-config-data" (OuterVolumeSpecName: "config-data") pod "38dc6520-ca66-44cf-bd2d-6d65bf57ff3a" (UID: "38dc6520-ca66-44cf-bd2d-6d65bf57ff3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:33 crc kubenswrapper[4742]: I0317 11:33:33.107246 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:33 crc kubenswrapper[4742]: I0317 11:33:33.477352 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-cf69c6b9b-d9hmq" Mar 17 11:33:33 crc kubenswrapper[4742]: I0317 11:33:33.715240 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"12f6380b-463f-4c5b-9c4a-809c874b2ca5","Type":"ContainerStarted","Data":"540b000acd6427abc962f6b4761c0a4f57604adc10b0d50424feb71f910388c0"} Mar 17 11:33:33 crc kubenswrapper[4742]: I0317 11:33:33.717271 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76c59bbd5d-b7mv4" event={"ID":"38dc6520-ca66-44cf-bd2d-6d65bf57ff3a","Type":"ContainerDied","Data":"c43fed673a14414c2d07fee00ae8ee07cea5c016a3751834743d19f4e301e6aa"} Mar 17 11:33:33 crc kubenswrapper[4742]: I0317 11:33:33.717296 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-76c59bbd5d-b7mv4" Mar 17 11:33:33 crc kubenswrapper[4742]: I0317 11:33:33.717326 4742 scope.go:117] "RemoveContainer" containerID="f98f9505566dbe4a3a4f85287f648f6e3592ea6be5687ab836fe65a33e535a24" Mar 17 11:33:33 crc kubenswrapper[4742]: I0317 11:33:33.736483 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.7364698990000003 podStartE2EDuration="3.736469899s" podCreationTimestamp="2026-03-17 11:33:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:33:33.732837338 +0000 UTC m=+1316.858965096" watchObservedRunningTime="2026-03-17 11:33:33.736469899 +0000 UTC m=+1316.862597657" Mar 17 11:33:33 crc kubenswrapper[4742]: I0317 11:33:33.747928 4742 scope.go:117] "RemoveContainer" containerID="b3f970fd22311f013fc289b3df07b5f78e7a4f99a5ffb24d66df51f98fa31fd2" Mar 17 11:33:33 crc kubenswrapper[4742]: I0317 11:33:33.763972 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-76c59bbd5d-b7mv4"] Mar 17 11:33:33 crc kubenswrapper[4742]: I0317 11:33:33.771429 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-76c59bbd5d-b7mv4"] Mar 17 11:33:34 crc kubenswrapper[4742]: I0317 11:33:34.685247 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38dc6520-ca66-44cf-bd2d-6d65bf57ff3a" path="/var/lib/kubelet/pods/38dc6520-ca66-44cf-bd2d-6d65bf57ff3a/volumes" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.052406 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.390333 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-774cd45c89-tc5lr" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.470803 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-combined-ca-bundle\") pod \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.470855 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6njh\" (UniqueName: \"kubernetes.io/projected/ca9f66f5-5921-4f35-a45a-0de69f1a3434-kube-api-access-m6njh\") pod \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.470933 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-public-tls-certs\") pod \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.470963 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-internal-tls-certs\") pod \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.471018 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-ovndb-tls-certs\") pod \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.471099 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-httpd-config\") pod \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.471223 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-config\") pod \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\" (UID: \"ca9f66f5-5921-4f35-a45a-0de69f1a3434\") " Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.487452 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca9f66f5-5921-4f35-a45a-0de69f1a3434-kube-api-access-m6njh" (OuterVolumeSpecName: "kube-api-access-m6njh") pod "ca9f66f5-5921-4f35-a45a-0de69f1a3434" (UID: "ca9f66f5-5921-4f35-a45a-0de69f1a3434"). InnerVolumeSpecName "kube-api-access-m6njh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.489074 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ca9f66f5-5921-4f35-a45a-0de69f1a3434" (UID: "ca9f66f5-5921-4f35-a45a-0de69f1a3434"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.519997 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ca9f66f5-5921-4f35-a45a-0de69f1a3434" (UID: "ca9f66f5-5921-4f35-a45a-0de69f1a3434"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.550330 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-config" (OuterVolumeSpecName: "config") pod "ca9f66f5-5921-4f35-a45a-0de69f1a3434" (UID: "ca9f66f5-5921-4f35-a45a-0de69f1a3434"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.555313 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ca9f66f5-5921-4f35-a45a-0de69f1a3434" (UID: "ca9f66f5-5921-4f35-a45a-0de69f1a3434"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.573224 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.573272 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6njh\" (UniqueName: \"kubernetes.io/projected/ca9f66f5-5921-4f35-a45a-0de69f1a3434-kube-api-access-m6njh\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.573288 4742 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.573300 4742 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.573312 4742 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.575416 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ca9f66f5-5921-4f35-a45a-0de69f1a3434" (UID: "ca9f66f5-5921-4f35-a45a-0de69f1a3434"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.576731 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca9f66f5-5921-4f35-a45a-0de69f1a3434" (UID: "ca9f66f5-5921-4f35-a45a-0de69f1a3434"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.674649 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.674683 4742 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca9f66f5-5921-4f35-a45a-0de69f1a3434-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.750666 4742 generic.go:334] "Generic (PLEG): container finished" podID="ca9f66f5-5921-4f35-a45a-0de69f1a3434" containerID="9cbb8d608c9da7785459131e7c950b27ff8bb5a50ff50434ae31b524c78ba95d" exitCode=0 Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.750728 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-774cd45c89-tc5lr" event={"ID":"ca9f66f5-5921-4f35-a45a-0de69f1a3434","Type":"ContainerDied","Data":"9cbb8d608c9da7785459131e7c950b27ff8bb5a50ff50434ae31b524c78ba95d"} Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.750760 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-774cd45c89-tc5lr" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.750798 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-774cd45c89-tc5lr" event={"ID":"ca9f66f5-5921-4f35-a45a-0de69f1a3434","Type":"ContainerDied","Data":"169676225e070c8b5d3d0453469f9c25590bc2576aafb8b58c5b3a187de378e8"} Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.750825 4742 scope.go:117] "RemoveContainer" containerID="50df2e1afbc3ba5e5be7c6cf914d4d34d955758f6abc13b7f49af5e247bff50c" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.783742 4742 scope.go:117] "RemoveContainer" containerID="9cbb8d608c9da7785459131e7c950b27ff8bb5a50ff50434ae31b524c78ba95d" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.784798 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-774cd45c89-tc5lr"] Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.795096 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-774cd45c89-tc5lr"] Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.822403 4742 scope.go:117] "RemoveContainer" containerID="50df2e1afbc3ba5e5be7c6cf914d4d34d955758f6abc13b7f49af5e247bff50c" Mar 17 11:33:36 crc kubenswrapper[4742]: E0317 11:33:36.822897 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50df2e1afbc3ba5e5be7c6cf914d4d34d955758f6abc13b7f49af5e247bff50c\": container with ID starting with 50df2e1afbc3ba5e5be7c6cf914d4d34d955758f6abc13b7f49af5e247bff50c not found: ID does not exist" containerID="50df2e1afbc3ba5e5be7c6cf914d4d34d955758f6abc13b7f49af5e247bff50c" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.822964 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50df2e1afbc3ba5e5be7c6cf914d4d34d955758f6abc13b7f49af5e247bff50c"} err="failed to get container status \"50df2e1afbc3ba5e5be7c6cf914d4d34d955758f6abc13b7f49af5e247bff50c\": rpc error: code = NotFound desc = could not find container \"50df2e1afbc3ba5e5be7c6cf914d4d34d955758f6abc13b7f49af5e247bff50c\": container with ID starting with 50df2e1afbc3ba5e5be7c6cf914d4d34d955758f6abc13b7f49af5e247bff50c not found: ID does 
not exist" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.823014 4742 scope.go:117] "RemoveContainer" containerID="9cbb8d608c9da7785459131e7c950b27ff8bb5a50ff50434ae31b524c78ba95d" Mar 17 11:33:36 crc kubenswrapper[4742]: E0317 11:33:36.823321 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cbb8d608c9da7785459131e7c950b27ff8bb5a50ff50434ae31b524c78ba95d\": container with ID starting with 9cbb8d608c9da7785459131e7c950b27ff8bb5a50ff50434ae31b524c78ba95d not found: ID does not exist" containerID="9cbb8d608c9da7785459131e7c950b27ff8bb5a50ff50434ae31b524c78ba95d" Mar 17 11:33:36 crc kubenswrapper[4742]: I0317 11:33:36.823356 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cbb8d608c9da7785459131e7c950b27ff8bb5a50ff50434ae31b524c78ba95d"} err="failed to get container status \"9cbb8d608c9da7785459131e7c950b27ff8bb5a50ff50434ae31b524c78ba95d\": rpc error: code = NotFound desc = could not find container \"9cbb8d608c9da7785459131e7c950b27ff8bb5a50ff50434ae31b524c78ba95d\": container with ID starting with 9cbb8d608c9da7785459131e7c950b27ff8bb5a50ff50434ae31b524c78ba95d not found: ID does not exist" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.134074 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-c96b95bb7-ckpvc"] Mar 17 11:33:37 crc kubenswrapper[4742]: E0317 11:33:37.134796 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38dc6520-ca66-44cf-bd2d-6d65bf57ff3a" containerName="barbican-api-log" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.134813 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="38dc6520-ca66-44cf-bd2d-6d65bf57ff3a" containerName="barbican-api-log" Mar 17 11:33:37 crc kubenswrapper[4742]: E0317 11:33:37.134839 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca9f66f5-5921-4f35-a45a-0de69f1a3434" containerName="neutron-api" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.134847 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca9f66f5-5921-4f35-a45a-0de69f1a3434" containerName="neutron-api" Mar 17 11:33:37 crc kubenswrapper[4742]: E0317 11:33:37.134882 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca9f66f5-5921-4f35-a45a-0de69f1a3434" containerName="neutron-httpd" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.134890 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca9f66f5-5921-4f35-a45a-0de69f1a3434" containerName="neutron-httpd" Mar 17 11:33:37 crc kubenswrapper[4742]: E0317 11:33:37.134902 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38dc6520-ca66-44cf-bd2d-6d65bf57ff3a" containerName="barbican-api" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.135018 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="38dc6520-ca66-44cf-bd2d-6d65bf57ff3a" containerName="barbican-api" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.135233 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca9f66f5-5921-4f35-a45a-0de69f1a3434" containerName="neutron-httpd" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.135258 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="38dc6520-ca66-44cf-bd2d-6d65bf57ff3a" containerName="barbican-api-log" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.135270 4742 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="38dc6520-ca66-44cf-bd2d-6d65bf57ff3a" containerName="barbican-api" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.135285 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca9f66f5-5921-4f35-a45a-0de69f1a3434" containerName="neutron-api" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.136457 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-c96b95bb7-ckpvc" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.139607 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.139931 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.140145 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.150927 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-c96b95bb7-ckpvc"] Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.286673 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-internal-tls-certs\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.286722 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-config-data\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.286756 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-log-httpd\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.286777 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snj7p\" (UniqueName: \"kubernetes.io/projected/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-kube-api-access-snj7p\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.286797 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-combined-ca-bundle\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc" Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.286828 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-etc-swift\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc" 
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.286875 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-run-httpd\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.286891 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-public-tls-certs\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.390675 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-internal-tls-certs\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.390721 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-config-data\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.390743 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-log-httpd\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.390760 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snj7p\" (UniqueName: \"kubernetes.io/projected/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-kube-api-access-snj7p\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.390779 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-combined-ca-bundle\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.390809 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-etc-swift\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.390852 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-run-httpd\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.390869 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-public-tls-certs\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.392548 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-run-httpd\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.392558 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-log-httpd\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.396863 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-config-data\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.397497 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-internal-tls-certs\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.398586 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-combined-ca-bundle\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.401495 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-public-tls-certs\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.407839 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-etc-swift\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.408215 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snj7p\" (UniqueName: \"kubernetes.io/projected/b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe-kube-api-access-snj7p\") pod \"swift-proxy-c96b95bb7-ckpvc\" (UID: \"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe\") " pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:37 crc kubenswrapper[4742]: I0317 11:33:37.454079 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.007397 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.007893 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerName="ceilometer-central-agent" containerID="cri-o://f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0" gracePeriod=30
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.010565 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerName="proxy-httpd" containerID="cri-o://3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585" gracePeriod=30
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.010629 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerName="sg-core" containerID="cri-o://d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa" gracePeriod=30
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.010667 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerName="ceilometer-notification-agent" containerID="cri-o://bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb" gracePeriod=30
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.019201 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.021108 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-c96b95bb7-ckpvc"]
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.466950 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.469033 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.471385 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.471705 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.471880 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-fqcl4"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.502841 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.622409 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xww98\" (UniqueName: \"kubernetes.io/projected/11e12da8-9e80-453f-bbbd-03d1346afe5b-kube-api-access-xww98\") pod \"openstackclient\" (UID: \"11e12da8-9e80-453f-bbbd-03d1346afe5b\") " pod="openstack/openstackclient"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.622539 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11e12da8-9e80-453f-bbbd-03d1346afe5b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"11e12da8-9e80-453f-bbbd-03d1346afe5b\") " pod="openstack/openstackclient"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.622614 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/11e12da8-9e80-453f-bbbd-03d1346afe5b-openstack-config-secret\") pod \"openstackclient\" (UID: \"11e12da8-9e80-453f-bbbd-03d1346afe5b\") " pod="openstack/openstackclient"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.622650 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/11e12da8-9e80-453f-bbbd-03d1346afe5b-openstack-config\") pod \"openstackclient\" (UID: \"11e12da8-9e80-453f-bbbd-03d1346afe5b\") " pod="openstack/openstackclient"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.680032 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca9f66f5-5921-4f35-a45a-0de69f1a3434" path="/var/lib/kubelet/pods/ca9f66f5-5921-4f35-a45a-0de69f1a3434/volumes"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.724356 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/11e12da8-9e80-453f-bbbd-03d1346afe5b-openstack-config-secret\") pod \"openstackclient\" (UID: \"11e12da8-9e80-453f-bbbd-03d1346afe5b\") " pod="openstack/openstackclient"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.724421 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/11e12da8-9e80-453f-bbbd-03d1346afe5b-openstack-config\") pod \"openstackclient\" (UID: \"11e12da8-9e80-453f-bbbd-03d1346afe5b\") " pod="openstack/openstackclient"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.724483 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xww98\" (UniqueName: \"kubernetes.io/projected/11e12da8-9e80-453f-bbbd-03d1346afe5b-kube-api-access-xww98\") pod \"openstackclient\" (UID: \"11e12da8-9e80-453f-bbbd-03d1346afe5b\") " pod="openstack/openstackclient"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.724571 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11e12da8-9e80-453f-bbbd-03d1346afe5b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"11e12da8-9e80-453f-bbbd-03d1346afe5b\") " pod="openstack/openstackclient"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.730243 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.730253 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.731142 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11e12da8-9e80-453f-bbbd-03d1346afe5b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"11e12da8-9e80-453f-bbbd-03d1346afe5b\") " pod="openstack/openstackclient"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.736071 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.736217 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/11e12da8-9e80-453f-bbbd-03d1346afe5b-openstack-config\") pod \"openstackclient\" (UID: \"11e12da8-9e80-453f-bbbd-03d1346afe5b\") " pod="openstack/openstackclient"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.739880 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/11e12da8-9e80-453f-bbbd-03d1346afe5b-openstack-config-secret\") pod \"openstackclient\" (UID: \"11e12da8-9e80-453f-bbbd-03d1346afe5b\") " pod="openstack/openstackclient"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.743360 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xww98\" (UniqueName: \"kubernetes.io/projected/11e12da8-9e80-453f-bbbd-03d1346afe5b-kube-api-access-xww98\") pod \"openstackclient\" (UID: \"11e12da8-9e80-453f-bbbd-03d1346afe5b\") " pod="openstack/openstackclient"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.783524 4742 generic.go:334] "Generic (PLEG): container finished" podID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerID="3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585" exitCode=0
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.783555 4742 generic.go:334] "Generic (PLEG): container finished" podID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerID="d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa" exitCode=2
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.783564 4742 generic.go:334] "Generic (PLEG): container finished" podID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerID="bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb" exitCode=0
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.783572 4742 generic.go:334] "Generic (PLEG): container finished" podID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerID="f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0" exitCode=0
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.783608 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc51fde3-3dea-4f98-ac76-18d3a410fab7","Type":"ContainerDied","Data":"3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585"}
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.783632 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc51fde3-3dea-4f98-ac76-18d3a410fab7","Type":"ContainerDied","Data":"d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa"}
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.783643 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc51fde3-3dea-4f98-ac76-18d3a410fab7","Type":"ContainerDied","Data":"bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb"}
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.783651 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc51fde3-3dea-4f98-ac76-18d3a410fab7","Type":"ContainerDied","Data":"f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0"}
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.783659 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc51fde3-3dea-4f98-ac76-18d3a410fab7","Type":"ContainerDied","Data":"a3a216842471606a7183e4dc8478d80fa97b1385eb23f7628e685f2bf9b98497"}
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.783674 4742 scope.go:117] "RemoveContainer" containerID="3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.783774 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.787767 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c96b95bb7-ckpvc" event={"ID":"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe","Type":"ContainerStarted","Data":"238bfde84c34a4fa5d724d1bdc7ae380517f7fefca6b365e5afe095e06e36fe9"}
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.787821 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c96b95bb7-ckpvc" event={"ID":"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe","Type":"ContainerStarted","Data":"d0dcbfebad922ac8074c23e66923e8b37579c277ad03b53129c411a465730b3d"}
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.787832 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-c96b95bb7-ckpvc" event={"ID":"b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe","Type":"ContainerStarted","Data":"eab55067e7c194d4f23277941e14a196f9e07dfb476a06c8343e76701ec178d7"}
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.788511 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.788634 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-c96b95bb7-ckpvc"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.803183 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-fqcl4"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.823238 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.825455 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-combined-ca-bundle\") pod \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") "
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.825695 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-config-data\") pod \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") "
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.825795 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc51fde3-3dea-4f98-ac76-18d3a410fab7-run-httpd\") pod \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") "
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.828784 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc51fde3-3dea-4f98-ac76-18d3a410fab7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cc51fde3-3dea-4f98-ac76-18d3a410fab7" (UID: "cc51fde3-3dea-4f98-ac76-18d3a410fab7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.829074 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-scripts\") pod \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") "
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.829177 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p47m\" (UniqueName: \"kubernetes.io/projected/cc51fde3-3dea-4f98-ac76-18d3a410fab7-kube-api-access-2p47m\") pod \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") "
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.829234 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc51fde3-3dea-4f98-ac76-18d3a410fab7-log-httpd\") pod \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") "
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.829266 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-sg-core-conf-yaml\") pod \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") "
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.831750 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc51fde3-3dea-4f98-ac76-18d3a410fab7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cc51fde3-3dea-4f98-ac76-18d3a410fab7" (UID: "cc51fde3-3dea-4f98-ac76-18d3a410fab7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.833087 4742 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc51fde3-3dea-4f98-ac76-18d3a410fab7-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.833104 4742 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc51fde3-3dea-4f98-ac76-18d3a410fab7-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.834336 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-scripts" (OuterVolumeSpecName: "scripts") pod "cc51fde3-3dea-4f98-ac76-18d3a410fab7" (UID: "cc51fde3-3dea-4f98-ac76-18d3a410fab7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.838576 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc51fde3-3dea-4f98-ac76-18d3a410fab7-kube-api-access-2p47m" (OuterVolumeSpecName: "kube-api-access-2p47m") pod "cc51fde3-3dea-4f98-ac76-18d3a410fab7" (UID: "cc51fde3-3dea-4f98-ac76-18d3a410fab7"). InnerVolumeSpecName "kube-api-access-2p47m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.838930 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-c96b95bb7-ckpvc" podStartSLOduration=1.838892663 podStartE2EDuration="1.838892663s" podCreationTimestamp="2026-03-17 11:33:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:33:38.817698103 +0000 UTC m=+1321.943825851" watchObservedRunningTime="2026-03-17 11:33:38.838892663 +0000 UTC m=+1321.965020421"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.851679 4742 scope.go:117] "RemoveContainer" containerID="d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.868572 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cc51fde3-3dea-4f98-ac76-18d3a410fab7" (UID: "cc51fde3-3dea-4f98-ac76-18d3a410fab7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.921864 4742 scope.go:117] "RemoveContainer" containerID="bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.936667 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc51fde3-3dea-4f98-ac76-18d3a410fab7" (UID: "cc51fde3-3dea-4f98-ac76-18d3a410fab7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.937542 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-combined-ca-bundle\") pod \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\" (UID: \"cc51fde3-3dea-4f98-ac76-18d3a410fab7\") "
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.938252 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.938271 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p47m\" (UniqueName: \"kubernetes.io/projected/cc51fde3-3dea-4f98-ac76-18d3a410fab7-kube-api-access-2p47m\") on node \"crc\" DevicePath \"\""
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.938280 4742 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 17 11:33:38 crc kubenswrapper[4742]: W0317 11:33:38.939123 4742 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/cc51fde3-3dea-4f98-ac76-18d3a410fab7/volumes/kubernetes.io~secret/combined-ca-bundle
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.939144 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc51fde3-3dea-4f98-ac76-18d3a410fab7" (UID: "cc51fde3-3dea-4f98-ac76-18d3a410fab7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.939992 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-lsskf"]
Mar 17 11:33:38 crc kubenswrapper[4742]: E0317 11:33:38.940426 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerName="sg-core"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.940439 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerName="sg-core"
Mar 17 11:33:38 crc kubenswrapper[4742]: E0317 11:33:38.940449 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerName="proxy-httpd"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.940455 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerName="proxy-httpd"
Mar 17 11:33:38 crc kubenswrapper[4742]: E0317 11:33:38.940462 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerName="ceilometer-notification-agent"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.940469 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerName="ceilometer-notification-agent"
Mar 17 11:33:38 crc kubenswrapper[4742]: E0317 11:33:38.940492 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerName="ceilometer-central-agent"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.940513 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerName="ceilometer-central-agent"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.940701 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerName="ceilometer-central-agent"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.940710 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerName="sg-core"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.940722 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerName="proxy-httpd"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.940730 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" containerName="ceilometer-notification-agent"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.941269 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lsskf"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.958945 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-lsskf"]
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.982150 4742 scope.go:117] "RemoveContainer" containerID="f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0"
Mar 17 11:33:38 crc kubenswrapper[4742]: I0317 11:33:38.990056 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-config-data" (OuterVolumeSpecName: "config-data") pod "cc51fde3-3dea-4f98-ac76-18d3a410fab7" (UID: "cc51fde3-3dea-4f98-ac76-18d3a410fab7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.004564 4742 scope.go:117] "RemoveContainer" containerID="3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585" Mar 17 11:33:39 crc kubenswrapper[4742]: E0317 11:33:39.005316 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585\": container with ID starting with 3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585 not found: ID does not exist" containerID="3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.005360 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585"} err="failed to get container status \"3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585\": rpc error: code = NotFound desc = could not find container \"3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585\": container with ID starting with 3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585 not found: ID does not exist" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.005379 4742 scope.go:117] "RemoveContainer" containerID="d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa" Mar 17 11:33:39 crc kubenswrapper[4742]: E0317 11:33:39.005586 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa\": container with ID starting with d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa not found: ID does not exist" containerID="d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.005605 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa"} err="failed to get container status \"d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa\": rpc error: code = NotFound desc = could not find container \"d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa\": container with ID starting with d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa not found: ID does not exist" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.005617 4742 scope.go:117] "RemoveContainer" containerID="bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb" Mar 17 11:33:39 crc kubenswrapper[4742]: E0317 11:33:39.005782 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb\": container with ID starting with bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb not found: ID does not exist" containerID="bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.005813 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb"} err="failed to get container status \"bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb\": rpc error: code = NotFound desc = could not 
find container \"bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb\": container with ID starting with bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb not found: ID does not exist" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.005824 4742 scope.go:117] "RemoveContainer" containerID="f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0" Mar 17 11:33:39 crc kubenswrapper[4742]: E0317 11:33:39.005997 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0\": container with ID starting with f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0 not found: ID does not exist" containerID="f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.006013 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0"} err="failed to get container status \"f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0\": rpc error: code = NotFound desc = could not find container \"f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0\": container with ID starting with f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0 not found: ID does not exist" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.006023 4742 scope.go:117] "RemoveContainer" containerID="3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.006168 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585"} err="failed to get container status \"3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585\": rpc error: code = NotFound desc = could not find container \"3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585\": container with ID starting with 3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585 not found: ID does not exist" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.006180 4742 scope.go:117] "RemoveContainer" containerID="d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.006323 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa"} err="failed to get container status \"d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa\": rpc error: code = NotFound desc = could not find container \"d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa\": container with ID starting with d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa not found: ID does not exist" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.006334 4742 scope.go:117] "RemoveContainer" containerID="bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.006473 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb"} err="failed to get container status \"bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb\": rpc error: code = NotFound desc = could not 
find container \"bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb\": container with ID starting with bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb not found: ID does not exist" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.006485 4742 scope.go:117] "RemoveContainer" containerID="f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.006648 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0"} err="failed to get container status \"f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0\": rpc error: code = NotFound desc = could not find container \"f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0\": container with ID starting with f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0 not found: ID does not exist" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.006659 4742 scope.go:117] "RemoveContainer" containerID="3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.008633 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585"} err="failed to get container status \"3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585\": rpc error: code = NotFound desc = could not find container \"3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585\": container with ID starting with 3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585 not found: ID does not exist" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.008672 4742 scope.go:117] "RemoveContainer" containerID="d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.008945 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa"} err="failed to get container status \"d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa\": rpc error: code = NotFound desc = could not find container \"d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa\": container with ID starting with d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa not found: ID does not exist" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.008959 4742 scope.go:117] "RemoveContainer" containerID="bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.013350 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb"} err="failed to get container status \"bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb\": rpc error: code = NotFound desc = could not find container \"bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb\": container with ID starting with bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb not found: ID does not exist" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.013390 4742 scope.go:117] "RemoveContainer" containerID="f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.013976 4742 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0"} err="failed to get container status \"f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0\": rpc error: code = NotFound desc = could not find container \"f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0\": container with ID starting with f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0 not found: ID does not exist" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.014003 4742 scope.go:117] "RemoveContainer" containerID="3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.014841 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585"} err="failed to get container status \"3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585\": rpc error: code = NotFound desc = could not find container \"3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585\": container with ID starting with 3f8fc7d4a95ce073b978783bc9a7749a9fdf809f0f2bf4ec06d3b8e118887585 not found: ID does not exist" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.014886 4742 scope.go:117] "RemoveContainer" containerID="d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.027536 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa"} err="failed to get container status \"d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa\": rpc error: code = NotFound desc = could not find container \"d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa\": container with ID starting with d521aa3a89416bc5115f2dbba72a3cea576c311522c5a8a8fc0fc66db91c4aaa not found: ID does not exist" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.027579 4742 scope.go:117] "RemoveContainer" containerID="bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.030568 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb"} err="failed to get container status \"bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb\": rpc error: code = NotFound desc = could not find container \"bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb\": container with ID starting with bc7717d12d999862547b16e048d18f9b43462f57a6d591d778fdd3227bad7ecb not found: ID does not exist" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.030618 4742 scope.go:117] "RemoveContainer" containerID="f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.030967 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0"} err="failed to get container status \"f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0\": rpc error: code = NotFound desc = could not find container \"f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0\": container with ID starting with 
f1154361743c4661ad77e11d99ad24fd929825c0f14e281ce5a0117fb5877aa0 not found: ID does not exist" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.031321 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-nx2qq"] Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.035141 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nx2qq" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.039251 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536f4bea-32b3-4fd4-a576-b73a67d7ad23-operator-scripts\") pod \"nova-api-db-create-lsskf\" (UID: \"536f4bea-32b3-4fd4-a576-b73a67d7ad23\") " pod="openstack/nova-api-db-create-lsskf" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.039288 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlltg\" (UniqueName: \"kubernetes.io/projected/536f4bea-32b3-4fd4-a576-b73a67d7ad23-kube-api-access-xlltg\") pod \"nova-api-db-create-lsskf\" (UID: \"536f4bea-32b3-4fd4-a576-b73a67d7ad23\") " pod="openstack/nova-api-db-create-lsskf" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.039418 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.039430 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc51fde3-3dea-4f98-ac76-18d3a410fab7-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.052843 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nx2qq"] Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.067783 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-9a1d-account-create-update-fk254"] Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.069101 4742 util.go:30] "No sandbox for pod can be found. 
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.071814 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.140030 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9a1d-account-create-update-fk254"]
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.142650 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x9hz\" (UniqueName: \"kubernetes.io/projected/2c138c21-ff11-48af-9745-1e11b6b11467-kube-api-access-2x9hz\") pod \"nova-api-9a1d-account-create-update-fk254\" (UID: \"2c138c21-ff11-48af-9745-1e11b6b11467\") " pod="openstack/nova-api-9a1d-account-create-update-fk254"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.142705 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxlmz\" (UniqueName: \"kubernetes.io/projected/be226402-dc63-46e4-a635-670382b29013-kube-api-access-xxlmz\") pod \"nova-cell0-db-create-nx2qq\" (UID: \"be226402-dc63-46e4-a635-670382b29013\") " pod="openstack/nova-cell0-db-create-nx2qq"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.142775 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be226402-dc63-46e4-a635-670382b29013-operator-scripts\") pod \"nova-cell0-db-create-nx2qq\" (UID: \"be226402-dc63-46e4-a635-670382b29013\") " pod="openstack/nova-cell0-db-create-nx2qq"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.142800 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c138c21-ff11-48af-9745-1e11b6b11467-operator-scripts\") pod \"nova-api-9a1d-account-create-update-fk254\" (UID: \"2c138c21-ff11-48af-9745-1e11b6b11467\") " pod="openstack/nova-api-9a1d-account-create-update-fk254"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.142845 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536f4bea-32b3-4fd4-a576-b73a67d7ad23-operator-scripts\") pod \"nova-api-db-create-lsskf\" (UID: \"536f4bea-32b3-4fd4-a576-b73a67d7ad23\") " pod="openstack/nova-api-db-create-lsskf"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.142869 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlltg\" (UniqueName: \"kubernetes.io/projected/536f4bea-32b3-4fd4-a576-b73a67d7ad23-kube-api-access-xlltg\") pod \"nova-api-db-create-lsskf\" (UID: \"536f4bea-32b3-4fd4-a576-b73a67d7ad23\") " pod="openstack/nova-api-db-create-lsskf"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.143716 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536f4bea-32b3-4fd4-a576-b73a67d7ad23-operator-scripts\") pod \"nova-api-db-create-lsskf\" (UID: \"536f4bea-32b3-4fd4-a576-b73a67d7ad23\") " pod="openstack/nova-api-db-create-lsskf"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.182111 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.195783 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlltg\" (UniqueName: \"kubernetes.io/projected/536f4bea-32b3-4fd4-a576-b73a67d7ad23-kube-api-access-xlltg\") pod \"nova-api-db-create-lsskf\" (UID: \"536f4bea-32b3-4fd4-a576-b73a67d7ad23\") " pod="openstack/nova-api-db-create-lsskf"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.196435 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.229575 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.231750 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.234459 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.234719 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.245390 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x9hz\" (UniqueName: \"kubernetes.io/projected/2c138c21-ff11-48af-9745-1e11b6b11467-kube-api-access-2x9hz\") pod \"nova-api-9a1d-account-create-update-fk254\" (UID: \"2c138c21-ff11-48af-9745-1e11b6b11467\") " pod="openstack/nova-api-9a1d-account-create-update-fk254"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.245427 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxlmz\" (UniqueName: \"kubernetes.io/projected/be226402-dc63-46e4-a635-670382b29013-kube-api-access-xxlmz\") pod \"nova-cell0-db-create-nx2qq\" (UID: \"be226402-dc63-46e4-a635-670382b29013\") " pod="openstack/nova-cell0-db-create-nx2qq"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.245505 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be226402-dc63-46e4-a635-670382b29013-operator-scripts\") pod \"nova-cell0-db-create-nx2qq\" (UID: \"be226402-dc63-46e4-a635-670382b29013\") " pod="openstack/nova-cell0-db-create-nx2qq"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.245528 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c138c21-ff11-48af-9745-1e11b6b11467-operator-scripts\") pod \"nova-api-9a1d-account-create-update-fk254\" (UID: \"2c138c21-ff11-48af-9745-1e11b6b11467\") " pod="openstack/nova-api-9a1d-account-create-update-fk254"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.246166 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c138c21-ff11-48af-9745-1e11b6b11467-operator-scripts\") pod \"nova-api-9a1d-account-create-update-fk254\" (UID: \"2c138c21-ff11-48af-9745-1e11b6b11467\") " pod="openstack/nova-api-9a1d-account-create-update-fk254"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.246738 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be226402-dc63-46e4-a635-670382b29013-operator-scripts\") pod \"nova-cell0-db-create-nx2qq\" (UID: \"be226402-dc63-46e4-a635-670382b29013\") " pod="openstack/nova-cell0-db-create-nx2qq"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.260066 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.264423 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x9hz\" (UniqueName: \"kubernetes.io/projected/2c138c21-ff11-48af-9745-1e11b6b11467-kube-api-access-2x9hz\") pod \"nova-api-9a1d-account-create-update-fk254\" (UID: \"2c138c21-ff11-48af-9745-1e11b6b11467\") " pod="openstack/nova-api-9a1d-account-create-update-fk254"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.266877 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxlmz\" (UniqueName: \"kubernetes.io/projected/be226402-dc63-46e4-a635-670382b29013-kube-api-access-xxlmz\") pod \"nova-cell0-db-create-nx2qq\" (UID: \"be226402-dc63-46e4-a635-670382b29013\") " pod="openstack/nova-cell0-db-create-nx2qq"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.269526 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-vcls7"]
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.271031 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vcls7"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.272983 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lsskf"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.284048 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-66c1-account-create-update-98hrl"]
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.285357 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-66c1-account-create-update-98hrl"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.286931 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.305108 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-vcls7"]
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.326759 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-66c1-account-create-update-98hrl"]
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.346819 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.346864 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fhz6\" (UniqueName: \"kubernetes.io/projected/1fce8ab7-2019-4d76-a2b5-003b5489bd87-kube-api-access-7fhz6\") pod \"nova-cell0-66c1-account-create-update-98hrl\" (UID: \"1fce8ab7-2019-4d76-a2b5-003b5489bd87\") " pod="openstack/nova-cell0-66c1-account-create-update-98hrl"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.346902 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a3f6205-139f-49cf-8262-1c17f5beb979-log-httpd\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.346972 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fce8ab7-2019-4d76-a2b5-003b5489bd87-operator-scripts\") pod \"nova-cell0-66c1-account-create-update-98hrl\" (UID: \"1fce8ab7-2019-4d76-a2b5-003b5489bd87\") " pod="openstack/nova-cell0-66c1-account-create-update-98hrl"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.347000 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5924\" (UniqueName: \"kubernetes.io/projected/0a3f6205-139f-49cf-8262-1c17f5beb979-kube-api-access-b5924\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.347018 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-scripts\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.347045 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.347106 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a3f6205-139f-49cf-8262-1c17f5beb979-run-httpd\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.347132 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/191435c1-53cc-4df8-97de-1c71c78d9595-operator-scripts\") pod \"nova-cell1-db-create-vcls7\" (UID: \"191435c1-53cc-4df8-97de-1c71c78d9595\") " pod="openstack/nova-cell1-db-create-vcls7"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.347160 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-config-data\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.347184 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rjht\" (UniqueName: \"kubernetes.io/projected/191435c1-53cc-4df8-97de-1c71c78d9595-kube-api-access-5rjht\") pod \"nova-cell1-db-create-vcls7\" (UID: \"191435c1-53cc-4df8-97de-1c71c78d9595\") " pod="openstack/nova-cell1-db-create-vcls7"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.391103 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.431827 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nx2qq"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.452257 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/191435c1-53cc-4df8-97de-1c71c78d9595-operator-scripts\") pod \"nova-cell1-db-create-vcls7\" (UID: \"191435c1-53cc-4df8-97de-1c71c78d9595\") " pod="openstack/nova-cell1-db-create-vcls7"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.452312 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-config-data\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.452346 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rjht\" (UniqueName: \"kubernetes.io/projected/191435c1-53cc-4df8-97de-1c71c78d9595-kube-api-access-5rjht\") pod \"nova-cell1-db-create-vcls7\" (UID: \"191435c1-53cc-4df8-97de-1c71c78d9595\") " pod="openstack/nova-cell1-db-create-vcls7"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.452375 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.452390 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fhz6\" (UniqueName: \"kubernetes.io/projected/1fce8ab7-2019-4d76-a2b5-003b5489bd87-kube-api-access-7fhz6\") pod \"nova-cell0-66c1-account-create-update-98hrl\" (UID: \"1fce8ab7-2019-4d76-a2b5-003b5489bd87\") " pod="openstack/nova-cell0-66c1-account-create-update-98hrl"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.452424 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a3f6205-139f-49cf-8262-1c17f5beb979-log-httpd\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.452483 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fce8ab7-2019-4d76-a2b5-003b5489bd87-operator-scripts\") pod \"nova-cell0-66c1-account-create-update-98hrl\" (UID: \"1fce8ab7-2019-4d76-a2b5-003b5489bd87\") " pod="openstack/nova-cell0-66c1-account-create-update-98hrl"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.452510 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5924\" (UniqueName: \"kubernetes.io/projected/0a3f6205-139f-49cf-8262-1c17f5beb979-kube-api-access-b5924\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.452527 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-scripts\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.452556 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.452629 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a3f6205-139f-49cf-8262-1c17f5beb979-run-httpd\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.453093 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a3f6205-139f-49cf-8262-1c17f5beb979-run-httpd\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.453398 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a3f6205-139f-49cf-8262-1c17f5beb979-log-httpd\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.453466 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/191435c1-53cc-4df8-97de-1c71c78d9595-operator-scripts\") pod \"nova-cell1-db-create-vcls7\" (UID: \"191435c1-53cc-4df8-97de-1c71c78d9595\") " pod="openstack/nova-cell1-db-create-vcls7"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.454141 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fce8ab7-2019-4d76-a2b5-003b5489bd87-operator-scripts\") pod \"nova-cell0-66c1-account-create-update-98hrl\" (UID: \"1fce8ab7-2019-4d76-a2b5-003b5489bd87\") " pod="openstack/nova-cell0-66c1-account-create-update-98hrl"
Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.457341 4742 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-api-9a1d-account-create-update-fk254" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.461264 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.463819 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-scripts\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.476925 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-config-data\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.477624 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-a76f-account-create-update-9nrf6"] Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.478859 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a76f-account-create-update-9nrf6" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.480744 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rjht\" (UniqueName: \"kubernetes.io/projected/191435c1-53cc-4df8-97de-1c71c78d9595-kube-api-access-5rjht\") pod \"nova-cell1-db-create-vcls7\" (UID: \"191435c1-53cc-4df8-97de-1c71c78d9595\") " pod="openstack/nova-cell1-db-create-vcls7" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.480866 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.483457 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fhz6\" (UniqueName: \"kubernetes.io/projected/1fce8ab7-2019-4d76-a2b5-003b5489bd87-kube-api-access-7fhz6\") pod \"nova-cell0-66c1-account-create-update-98hrl\" (UID: \"1fce8ab7-2019-4d76-a2b5-003b5489bd87\") " pod="openstack/nova-cell0-66c1-account-create-update-98hrl" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.484279 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5924\" (UniqueName: \"kubernetes.io/projected/0a3f6205-139f-49cf-8262-1c17f5beb979-kube-api-access-b5924\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.487021 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " pod="openstack/ceilometer-0" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.509951 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a76f-account-create-update-9nrf6"] Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.553992 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/f5e4c0a2-6132-410b-8740-6c9e171c7824-operator-scripts\") pod \"nova-cell1-a76f-account-create-update-9nrf6\" (UID: \"f5e4c0a2-6132-410b-8740-6c9e171c7824\") " pod="openstack/nova-cell1-a76f-account-create-update-9nrf6" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.554081 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssn99\" (UniqueName: \"kubernetes.io/projected/f5e4c0a2-6132-410b-8740-6c9e171c7824-kube-api-access-ssn99\") pod \"nova-cell1-a76f-account-create-update-9nrf6\" (UID: \"f5e4c0a2-6132-410b-8740-6c9e171c7824\") " pod="openstack/nova-cell1-a76f-account-create-update-9nrf6" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.561557 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.655894 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssn99\" (UniqueName: \"kubernetes.io/projected/f5e4c0a2-6132-410b-8740-6c9e171c7824-kube-api-access-ssn99\") pod \"nova-cell1-a76f-account-create-update-9nrf6\" (UID: \"f5e4c0a2-6132-410b-8740-6c9e171c7824\") " pod="openstack/nova-cell1-a76f-account-create-update-9nrf6" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.656133 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5e4c0a2-6132-410b-8740-6c9e171c7824-operator-scripts\") pod \"nova-cell1-a76f-account-create-update-9nrf6\" (UID: \"f5e4c0a2-6132-410b-8740-6c9e171c7824\") " pod="openstack/nova-cell1-a76f-account-create-update-9nrf6" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.656803 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5e4c0a2-6132-410b-8740-6c9e171c7824-operator-scripts\") pod \"nova-cell1-a76f-account-create-update-9nrf6\" (UID: \"f5e4c0a2-6132-410b-8740-6c9e171c7824\") " pod="openstack/nova-cell1-a76f-account-create-update-9nrf6" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.673361 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssn99\" (UniqueName: \"kubernetes.io/projected/f5e4c0a2-6132-410b-8740-6c9e171c7824-kube-api-access-ssn99\") pod \"nova-cell1-a76f-account-create-update-9nrf6\" (UID: \"f5e4c0a2-6132-410b-8740-6c9e171c7824\") " pod="openstack/nova-cell1-a76f-account-create-update-9nrf6" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.678052 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vcls7" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.695306 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-66c1-account-create-update-98hrl" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.792619 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-lsskf"] Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.833319 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-a76f-account-create-update-9nrf6" Mar 17 11:33:39 crc kubenswrapper[4742]: I0317 11:33:39.917965 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"11e12da8-9e80-453f-bbbd-03d1346afe5b","Type":"ContainerStarted","Data":"7f928c939ad1f7f34a753aebf94aae4d310de9638f201fe03cd578db06870ea8"} Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.013933 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nx2qq"] Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.024225 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9a1d-account-create-update-fk254"] Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.294993 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.502892 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-66c1-account-create-update-98hrl"] Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.588551 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-vcls7"] Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.613132 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a76f-account-create-update-9nrf6"] Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.680635 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc51fde3-3dea-4f98-ac76-18d3a410fab7" path="/var/lib/kubelet/pods/cc51fde3-3dea-4f98-ac76-18d3a410fab7/volumes" Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.934681 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vcls7" event={"ID":"191435c1-53cc-4df8-97de-1c71c78d9595","Type":"ContainerStarted","Data":"09bfd82acac1b94c1f73f7ca0c85c42677c88cfd225aac247dc1317905cfc092"} Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.934968 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vcls7" event={"ID":"191435c1-53cc-4df8-97de-1c71c78d9595","Type":"ContainerStarted","Data":"31fe40cf3aa0f04753e3d181d27128ac021c26086232a229bf97e8ec4934b525"} Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.940262 4742 generic.go:334] "Generic (PLEG): container finished" podID="be226402-dc63-46e4-a635-670382b29013" containerID="2d20c9a98cfb3056390dacb40419181cfa85f282a737dfffb38d5c2e64021e7c" exitCode=0 Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.940312 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nx2qq" event={"ID":"be226402-dc63-46e4-a635-670382b29013","Type":"ContainerDied","Data":"2d20c9a98cfb3056390dacb40419181cfa85f282a737dfffb38d5c2e64021e7c"} Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.940333 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nx2qq" event={"ID":"be226402-dc63-46e4-a635-670382b29013","Type":"ContainerStarted","Data":"2ea99ed05af978061713c600cc8ea4ebd5c80781c1229463054cef78d1adb1e3"} Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.942891 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a3f6205-139f-49cf-8262-1c17f5beb979","Type":"ContainerStarted","Data":"1a0212f85e806ff2cd888dcad3f60f44d856ab92e04244c98df17cb0abc0e84b"} Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.950784 4742 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell0-66c1-account-create-update-98hrl" event={"ID":"1fce8ab7-2019-4d76-a2b5-003b5489bd87","Type":"ContainerStarted","Data":"88168536c116dbbdd920a498b31d10d9503492eb277131566703e79519b7835c"} Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.950839 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-66c1-account-create-update-98hrl" event={"ID":"1fce8ab7-2019-4d76-a2b5-003b5489bd87","Type":"ContainerStarted","Data":"3fb2431ee69d6dceb983a2ee00cd849cbc3784bdb37c6e7a29d14a7a67072138"} Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.952780 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-vcls7" podStartSLOduration=1.9527589920000001 podStartE2EDuration="1.952758992s" podCreationTimestamp="2026-03-17 11:33:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:33:40.950502869 +0000 UTC m=+1324.076630627" watchObservedRunningTime="2026-03-17 11:33:40.952758992 +0000 UTC m=+1324.078886750" Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.953265 4742 generic.go:334] "Generic (PLEG): container finished" podID="2c138c21-ff11-48af-9745-1e11b6b11467" containerID="ac3e76307d9ac2538e6ae31da71f62ad326944c8bae3edefc48f083629472d1c" exitCode=0 Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.953313 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9a1d-account-create-update-fk254" event={"ID":"2c138c21-ff11-48af-9745-1e11b6b11467","Type":"ContainerDied","Data":"ac3e76307d9ac2538e6ae31da71f62ad326944c8bae3edefc48f083629472d1c"} Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.953334 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9a1d-account-create-update-fk254" event={"ID":"2c138c21-ff11-48af-9745-1e11b6b11467","Type":"ContainerStarted","Data":"100739fc49ea60851bca2dd51d013fbabc357a0774a73940eda3f34fd9ab4615"} Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.955066 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a76f-account-create-update-9nrf6" event={"ID":"f5e4c0a2-6132-410b-8740-6c9e171c7824","Type":"ContainerStarted","Data":"ed201381f67965df77d8cbea154884269fa05a627a0bd50f69090277b724b3ed"} Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.955094 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a76f-account-create-update-9nrf6" event={"ID":"f5e4c0a2-6132-410b-8740-6c9e171c7824","Type":"ContainerStarted","Data":"00cc78dbf36dd1c9e9cf86444577371b2a7ad5f195b72852695fecae455923a3"} Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.957776 4742 generic.go:334] "Generic (PLEG): container finished" podID="536f4bea-32b3-4fd4-a576-b73a67d7ad23" containerID="4ff039c5f059f386c06e8e1cfdbffd1e2a65257ed35a387c1224c90134df023a" exitCode=0 Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.957812 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lsskf" event={"ID":"536f4bea-32b3-4fd4-a576-b73a67d7ad23","Type":"ContainerDied","Data":"4ff039c5f059f386c06e8e1cfdbffd1e2a65257ed35a387c1224c90134df023a"} Mar 17 11:33:40 crc kubenswrapper[4742]: I0317 11:33:40.957828 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lsskf" 
event={"ID":"536f4bea-32b3-4fd4-a576-b73a67d7ad23","Type":"ContainerStarted","Data":"dda6a15c8d70a23b3994d0f5b226334afc388d87628633195858b157474dd649"} Mar 17 11:33:41 crc kubenswrapper[4742]: I0317 11:33:41.026418 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-a76f-account-create-update-9nrf6" podStartSLOduration=2.026396223 podStartE2EDuration="2.026396223s" podCreationTimestamp="2026-03-17 11:33:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:33:40.990248566 +0000 UTC m=+1324.116376324" watchObservedRunningTime="2026-03-17 11:33:41.026396223 +0000 UTC m=+1324.152523971" Mar 17 11:33:41 crc kubenswrapper[4742]: I0317 11:33:41.047751 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-66c1-account-create-update-98hrl" podStartSLOduration=2.047735047 podStartE2EDuration="2.047735047s" podCreationTimestamp="2026-03-17 11:33:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:33:41.042504552 +0000 UTC m=+1324.168632310" watchObservedRunningTime="2026-03-17 11:33:41.047735047 +0000 UTC m=+1324.173862805" Mar 17 11:33:41 crc kubenswrapper[4742]: I0317 11:33:41.262056 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 17 11:33:41 crc kubenswrapper[4742]: I0317 11:33:41.977395 4742 generic.go:334] "Generic (PLEG): container finished" podID="f5e4c0a2-6132-410b-8740-6c9e171c7824" containerID="ed201381f67965df77d8cbea154884269fa05a627a0bd50f69090277b724b3ed" exitCode=0 Mar 17 11:33:41 crc kubenswrapper[4742]: I0317 11:33:41.977746 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a76f-account-create-update-9nrf6" event={"ID":"f5e4c0a2-6132-410b-8740-6c9e171c7824","Type":"ContainerDied","Data":"ed201381f67965df77d8cbea154884269fa05a627a0bd50f69090277b724b3ed"} Mar 17 11:33:41 crc kubenswrapper[4742]: I0317 11:33:41.980384 4742 generic.go:334] "Generic (PLEG): container finished" podID="191435c1-53cc-4df8-97de-1c71c78d9595" containerID="09bfd82acac1b94c1f73f7ca0c85c42677c88cfd225aac247dc1317905cfc092" exitCode=0 Mar 17 11:33:41 crc kubenswrapper[4742]: I0317 11:33:41.980448 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vcls7" event={"ID":"191435c1-53cc-4df8-97de-1c71c78d9595","Type":"ContainerDied","Data":"09bfd82acac1b94c1f73f7ca0c85c42677c88cfd225aac247dc1317905cfc092"} Mar 17 11:33:41 crc kubenswrapper[4742]: I0317 11:33:41.982860 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a3f6205-139f-49cf-8262-1c17f5beb979","Type":"ContainerStarted","Data":"f252c9621aeb6f57540396f406c8024dabc2c53548f02f5d62bd6cbde906b7ec"} Mar 17 11:33:41 crc kubenswrapper[4742]: I0317 11:33:41.985173 4742 generic.go:334] "Generic (PLEG): container finished" podID="1fce8ab7-2019-4d76-a2b5-003b5489bd87" containerID="88168536c116dbbdd920a498b31d10d9503492eb277131566703e79519b7835c" exitCode=0 Mar 17 11:33:41 crc kubenswrapper[4742]: I0317 11:33:41.985448 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-66c1-account-create-update-98hrl" event={"ID":"1fce8ab7-2019-4d76-a2b5-003b5489bd87","Type":"ContainerDied","Data":"88168536c116dbbdd920a498b31d10d9503492eb277131566703e79519b7835c"} Mar 17 11:33:42 crc 
kubenswrapper[4742]: I0317 11:33:42.117311 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5cbc75d594-mvhf5" podUID="f2bbef92-cd02-42d8-b81d-ab7248e29328" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.452799 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lsskf" Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.542212 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536f4bea-32b3-4fd4-a576-b73a67d7ad23-operator-scripts\") pod \"536f4bea-32b3-4fd4-a576-b73a67d7ad23\" (UID: \"536f4bea-32b3-4fd4-a576-b73a67d7ad23\") " Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.542698 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/536f4bea-32b3-4fd4-a576-b73a67d7ad23-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "536f4bea-32b3-4fd4-a576-b73a67d7ad23" (UID: "536f4bea-32b3-4fd4-a576-b73a67d7ad23"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.542775 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlltg\" (UniqueName: \"kubernetes.io/projected/536f4bea-32b3-4fd4-a576-b73a67d7ad23-kube-api-access-xlltg\") pod \"536f4bea-32b3-4fd4-a576-b73a67d7ad23\" (UID: \"536f4bea-32b3-4fd4-a576-b73a67d7ad23\") " Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.543203 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536f4bea-32b3-4fd4-a576-b73a67d7ad23-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.551204 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/536f4bea-32b3-4fd4-a576-b73a67d7ad23-kube-api-access-xlltg" (OuterVolumeSpecName: "kube-api-access-xlltg") pod "536f4bea-32b3-4fd4-a576-b73a67d7ad23" (UID: "536f4bea-32b3-4fd4-a576-b73a67d7ad23"). InnerVolumeSpecName "kube-api-access-xlltg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.624472 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nx2qq" Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.629533 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-9a1d-account-create-update-fk254" Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.644987 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlltg\" (UniqueName: \"kubernetes.io/projected/536f4bea-32b3-4fd4-a576-b73a67d7ad23-kube-api-access-xlltg\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.746256 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be226402-dc63-46e4-a635-670382b29013-operator-scripts\") pod \"be226402-dc63-46e4-a635-670382b29013\" (UID: \"be226402-dc63-46e4-a635-670382b29013\") " Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.746587 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c138c21-ff11-48af-9745-1e11b6b11467-operator-scripts\") pod \"2c138c21-ff11-48af-9745-1e11b6b11467\" (UID: \"2c138c21-ff11-48af-9745-1e11b6b11467\") " Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.747144 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c138c21-ff11-48af-9745-1e11b6b11467-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c138c21-ff11-48af-9745-1e11b6b11467" (UID: "2c138c21-ff11-48af-9745-1e11b6b11467"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.747142 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be226402-dc63-46e4-a635-670382b29013-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be226402-dc63-46e4-a635-670382b29013" (UID: "be226402-dc63-46e4-a635-670382b29013"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.747271 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxlmz\" (UniqueName: \"kubernetes.io/projected/be226402-dc63-46e4-a635-670382b29013-kube-api-access-xxlmz\") pod \"be226402-dc63-46e4-a635-670382b29013\" (UID: \"be226402-dc63-46e4-a635-670382b29013\") " Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.747336 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x9hz\" (UniqueName: \"kubernetes.io/projected/2c138c21-ff11-48af-9745-1e11b6b11467-kube-api-access-2x9hz\") pod \"2c138c21-ff11-48af-9745-1e11b6b11467\" (UID: \"2c138c21-ff11-48af-9745-1e11b6b11467\") " Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.748102 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be226402-dc63-46e4-a635-670382b29013-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.748121 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c138c21-ff11-48af-9745-1e11b6b11467-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.750363 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c138c21-ff11-48af-9745-1e11b6b11467-kube-api-access-2x9hz" (OuterVolumeSpecName: "kube-api-access-2x9hz") pod "2c138c21-ff11-48af-9745-1e11b6b11467" (UID: "2c138c21-ff11-48af-9745-1e11b6b11467"). InnerVolumeSpecName "kube-api-access-2x9hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.751109 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be226402-dc63-46e4-a635-670382b29013-kube-api-access-xxlmz" (OuterVolumeSpecName: "kube-api-access-xxlmz") pod "be226402-dc63-46e4-a635-670382b29013" (UID: "be226402-dc63-46e4-a635-670382b29013"). InnerVolumeSpecName "kube-api-access-xxlmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.850201 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x9hz\" (UniqueName: \"kubernetes.io/projected/2c138c21-ff11-48af-9745-1e11b6b11467-kube-api-access-2x9hz\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.850228 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxlmz\" (UniqueName: \"kubernetes.io/projected/be226402-dc63-46e4-a635-670382b29013-kube-api-access-xxlmz\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.997443 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nx2qq" event={"ID":"be226402-dc63-46e4-a635-670382b29013","Type":"ContainerDied","Data":"2ea99ed05af978061713c600cc8ea4ebd5c80781c1229463054cef78d1adb1e3"} Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.998397 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ea99ed05af978061713c600cc8ea4ebd5c80781c1229463054cef78d1adb1e3" Mar 17 11:33:42 crc kubenswrapper[4742]: I0317 11:33:42.997861 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-nx2qq" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.004174 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a3f6205-139f-49cf-8262-1c17f5beb979","Type":"ContainerStarted","Data":"89624c1e11fa954ade4127695f9644194f90fd59f6d8b81c88167febb19ef04d"} Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.004286 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a3f6205-139f-49cf-8262-1c17f5beb979","Type":"ContainerStarted","Data":"a86eb43aa7d5bc9c0ad09a86de0916652eaf76218e4169f73f3380da347ed21e"} Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.012243 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9a1d-account-create-update-fk254" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.012234 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9a1d-account-create-update-fk254" event={"ID":"2c138c21-ff11-48af-9745-1e11b6b11467","Type":"ContainerDied","Data":"100739fc49ea60851bca2dd51d013fbabc357a0774a73940eda3f34fd9ab4615"} Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.012579 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="100739fc49ea60851bca2dd51d013fbabc357a0774a73940eda3f34fd9ab4615" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.015622 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lsskf" event={"ID":"536f4bea-32b3-4fd4-a576-b73a67d7ad23","Type":"ContainerDied","Data":"dda6a15c8d70a23b3994d0f5b226334afc388d87628633195858b157474dd649"} Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.015727 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dda6a15c8d70a23b3994d0f5b226334afc388d87628633195858b157474dd649" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.015840 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lsskf" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.431358 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-66c1-account-create-update-98hrl" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.565579 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fhz6\" (UniqueName: \"kubernetes.io/projected/1fce8ab7-2019-4d76-a2b5-003b5489bd87-kube-api-access-7fhz6\") pod \"1fce8ab7-2019-4d76-a2b5-003b5489bd87\" (UID: \"1fce8ab7-2019-4d76-a2b5-003b5489bd87\") " Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.565834 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fce8ab7-2019-4d76-a2b5-003b5489bd87-operator-scripts\") pod \"1fce8ab7-2019-4d76-a2b5-003b5489bd87\" (UID: \"1fce8ab7-2019-4d76-a2b5-003b5489bd87\") " Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.566422 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fce8ab7-2019-4d76-a2b5-003b5489bd87-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1fce8ab7-2019-4d76-a2b5-003b5489bd87" (UID: "1fce8ab7-2019-4d76-a2b5-003b5489bd87"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.566932 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vcls7" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.571731 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a76f-account-create-update-9nrf6" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.575268 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fce8ab7-2019-4d76-a2b5-003b5489bd87-kube-api-access-7fhz6" (OuterVolumeSpecName: "kube-api-access-7fhz6") pod "1fce8ab7-2019-4d76-a2b5-003b5489bd87" (UID: "1fce8ab7-2019-4d76-a2b5-003b5489bd87"). InnerVolumeSpecName "kube-api-access-7fhz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.669437 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rjht\" (UniqueName: \"kubernetes.io/projected/191435c1-53cc-4df8-97de-1c71c78d9595-kube-api-access-5rjht\") pod \"191435c1-53cc-4df8-97de-1c71c78d9595\" (UID: \"191435c1-53cc-4df8-97de-1c71c78d9595\") " Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.669595 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5e4c0a2-6132-410b-8740-6c9e171c7824-operator-scripts\") pod \"f5e4c0a2-6132-410b-8740-6c9e171c7824\" (UID: \"f5e4c0a2-6132-410b-8740-6c9e171c7824\") " Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.669674 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssn99\" (UniqueName: \"kubernetes.io/projected/f5e4c0a2-6132-410b-8740-6c9e171c7824-kube-api-access-ssn99\") pod \"f5e4c0a2-6132-410b-8740-6c9e171c7824\" (UID: \"f5e4c0a2-6132-410b-8740-6c9e171c7824\") " Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.669718 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/191435c1-53cc-4df8-97de-1c71c78d9595-operator-scripts\") pod \"191435c1-53cc-4df8-97de-1c71c78d9595\" (UID: \"191435c1-53cc-4df8-97de-1c71c78d9595\") " Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.670188 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fce8ab7-2019-4d76-a2b5-003b5489bd87-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.670206 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fhz6\" (UniqueName: \"kubernetes.io/projected/1fce8ab7-2019-4d76-a2b5-003b5489bd87-kube-api-access-7fhz6\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.670384 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/191435c1-53cc-4df8-97de-1c71c78d9595-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "191435c1-53cc-4df8-97de-1c71c78d9595" (UID: "191435c1-53cc-4df8-97de-1c71c78d9595"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.670519 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5e4c0a2-6132-410b-8740-6c9e171c7824-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f5e4c0a2-6132-410b-8740-6c9e171c7824" (UID: "f5e4c0a2-6132-410b-8740-6c9e171c7824"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.678129 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/191435c1-53cc-4df8-97de-1c71c78d9595-kube-api-access-5rjht" (OuterVolumeSpecName: "kube-api-access-5rjht") pod "191435c1-53cc-4df8-97de-1c71c78d9595" (UID: "191435c1-53cc-4df8-97de-1c71c78d9595"). InnerVolumeSpecName "kube-api-access-5rjht". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.678183 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5e4c0a2-6132-410b-8740-6c9e171c7824-kube-api-access-ssn99" (OuterVolumeSpecName: "kube-api-access-ssn99") pod "f5e4c0a2-6132-410b-8740-6c9e171c7824" (UID: "f5e4c0a2-6132-410b-8740-6c9e171c7824"). InnerVolumeSpecName "kube-api-access-ssn99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.772827 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rjht\" (UniqueName: \"kubernetes.io/projected/191435c1-53cc-4df8-97de-1c71c78d9595-kube-api-access-5rjht\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.772869 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5e4c0a2-6132-410b-8740-6c9e171c7824-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.772883 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssn99\" (UniqueName: \"kubernetes.io/projected/f5e4c0a2-6132-410b-8740-6c9e171c7824-kube-api-access-ssn99\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:43 crc kubenswrapper[4742]: I0317 11:33:43.772895 4742 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/191435c1-53cc-4df8-97de-1c71c78d9595-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.024391 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-66c1-account-create-update-98hrl" event={"ID":"1fce8ab7-2019-4d76-a2b5-003b5489bd87","Type":"ContainerDied","Data":"3fb2431ee69d6dceb983a2ee00cd849cbc3784bdb37c6e7a29d14a7a67072138"} Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.024430 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fb2431ee69d6dceb983a2ee00cd849cbc3784bdb37c6e7a29d14a7a67072138" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.026549 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a76f-account-create-update-9nrf6" event={"ID":"f5e4c0a2-6132-410b-8740-6c9e171c7824","Type":"ContainerDied","Data":"00cc78dbf36dd1c9e9cf86444577371b2a7ad5f195b72852695fecae455923a3"} Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.026575 4742 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="00cc78dbf36dd1c9e9cf86444577371b2a7ad5f195b72852695fecae455923a3" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.026675 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a76f-account-create-update-9nrf6" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.027244 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-66c1-account-create-update-98hrl" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.031335 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vcls7" event={"ID":"191435c1-53cc-4df8-97de-1c71c78d9595","Type":"ContainerDied","Data":"31fe40cf3aa0f04753e3d181d27128ac021c26086232a229bf97e8ec4934b525"} Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.031374 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31fe40cf3aa0f04753e3d181d27128ac021c26086232a229bf97e8ec4934b525" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.031408 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vcls7" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.520435 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bq6vr"] Mar 17 11:33:44 crc kubenswrapper[4742]: E0317 11:33:44.520859 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be226402-dc63-46e4-a635-670382b29013" containerName="mariadb-database-create" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.520875 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="be226402-dc63-46e4-a635-670382b29013" containerName="mariadb-database-create" Mar 17 11:33:44 crc kubenswrapper[4742]: E0317 11:33:44.520883 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="536f4bea-32b3-4fd4-a576-b73a67d7ad23" containerName="mariadb-database-create" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.520889 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="536f4bea-32b3-4fd4-a576-b73a67d7ad23" containerName="mariadb-database-create" Mar 17 11:33:44 crc kubenswrapper[4742]: E0317 11:33:44.520914 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fce8ab7-2019-4d76-a2b5-003b5489bd87" containerName="mariadb-account-create-update" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.520920 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fce8ab7-2019-4d76-a2b5-003b5489bd87" containerName="mariadb-account-create-update" Mar 17 11:33:44 crc kubenswrapper[4742]: E0317 11:33:44.520929 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5e4c0a2-6132-410b-8740-6c9e171c7824" containerName="mariadb-account-create-update" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.520935 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5e4c0a2-6132-410b-8740-6c9e171c7824" containerName="mariadb-account-create-update" Mar 17 11:33:44 crc kubenswrapper[4742]: E0317 11:33:44.520954 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c138c21-ff11-48af-9745-1e11b6b11467" containerName="mariadb-account-create-update" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.520960 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c138c21-ff11-48af-9745-1e11b6b11467" containerName="mariadb-account-create-update" Mar 17 11:33:44 crc kubenswrapper[4742]: E0317 11:33:44.520967 4742 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="191435c1-53cc-4df8-97de-1c71c78d9595" containerName="mariadb-database-create" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.521007 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="191435c1-53cc-4df8-97de-1c71c78d9595" containerName="mariadb-database-create" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.521160 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c138c21-ff11-48af-9745-1e11b6b11467" containerName="mariadb-account-create-update" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.521170 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5e4c0a2-6132-410b-8740-6c9e171c7824" containerName="mariadb-account-create-update" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.521184 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fce8ab7-2019-4d76-a2b5-003b5489bd87" containerName="mariadb-account-create-update" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.521198 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="191435c1-53cc-4df8-97de-1c71c78d9595" containerName="mariadb-database-create" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.521211 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="536f4bea-32b3-4fd4-a576-b73a67d7ad23" containerName="mariadb-database-create" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.521222 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="be226402-dc63-46e4-a635-670382b29013" containerName="mariadb-database-create" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.521816 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bq6vr" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.524429 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.524613 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.524631 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-4tjcv" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.545179 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bq6vr"] Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.589024 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2f8w\" (UniqueName: \"kubernetes.io/projected/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-kube-api-access-g2f8w\") pod \"nova-cell0-conductor-db-sync-bq6vr\" (UID: \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\") " pod="openstack/nova-cell0-conductor-db-sync-bq6vr" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.589126 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-scripts\") pod \"nova-cell0-conductor-db-sync-bq6vr\" (UID: \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\") " pod="openstack/nova-cell0-conductor-db-sync-bq6vr" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.589196 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bq6vr\" (UID: \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\") " pod="openstack/nova-cell0-conductor-db-sync-bq6vr" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.589221 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-config-data\") pod \"nova-cell0-conductor-db-sync-bq6vr\" (UID: \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\") " pod="openstack/nova-cell0-conductor-db-sync-bq6vr" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.691202 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2f8w\" (UniqueName: \"kubernetes.io/projected/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-kube-api-access-g2f8w\") pod \"nova-cell0-conductor-db-sync-bq6vr\" (UID: \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\") " pod="openstack/nova-cell0-conductor-db-sync-bq6vr" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.691312 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-scripts\") pod \"nova-cell0-conductor-db-sync-bq6vr\" (UID: \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\") " pod="openstack/nova-cell0-conductor-db-sync-bq6vr" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.691358 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bq6vr\" (UID: \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\") " pod="openstack/nova-cell0-conductor-db-sync-bq6vr" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.691383 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-config-data\") pod \"nova-cell0-conductor-db-sync-bq6vr\" (UID: \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\") " pod="openstack/nova-cell0-conductor-db-sync-bq6vr" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.696515 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-scripts\") pod \"nova-cell0-conductor-db-sync-bq6vr\" (UID: \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\") " pod="openstack/nova-cell0-conductor-db-sync-bq6vr" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.696680 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-bq6vr\" (UID: \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\") " pod="openstack/nova-cell0-conductor-db-sync-bq6vr" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.698856 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-config-data\") pod \"nova-cell0-conductor-db-sync-bq6vr\" (UID: \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\") " pod="openstack/nova-cell0-conductor-db-sync-bq6vr" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.706637 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2f8w\" 
(UniqueName: \"kubernetes.io/projected/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-kube-api-access-g2f8w\") pod \"nova-cell0-conductor-db-sync-bq6vr\" (UID: \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\") " pod="openstack/nova-cell0-conductor-db-sync-bq6vr" Mar 17 11:33:44 crc kubenswrapper[4742]: I0317 11:33:44.845470 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bq6vr" Mar 17 11:33:45 crc kubenswrapper[4742]: I0317 11:33:45.041808 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a3f6205-139f-49cf-8262-1c17f5beb979","Type":"ContainerStarted","Data":"489752cf5030bf150af46802e221606adff1081019dec2652f3caa0bc0eb3323"} Mar 17 11:33:45 crc kubenswrapper[4742]: I0317 11:33:45.041987 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 11:33:45 crc kubenswrapper[4742]: I0317 11:33:45.073198 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.803754853 podStartE2EDuration="6.073181803s" podCreationTimestamp="2026-03-17 11:33:39 +0000 UTC" firstStartedPulling="2026-03-17 11:33:40.317251812 +0000 UTC m=+1323.443379570" lastFinishedPulling="2026-03-17 11:33:44.586678752 +0000 UTC m=+1327.712806520" observedRunningTime="2026-03-17 11:33:45.065390297 +0000 UTC m=+1328.191518055" watchObservedRunningTime="2026-03-17 11:33:45.073181803 +0000 UTC m=+1328.199309561" Mar 17 11:33:47 crc kubenswrapper[4742]: I0317 11:33:47.406375 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:33:47 crc kubenswrapper[4742]: I0317 11:33:47.407526 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerName="ceilometer-central-agent" containerID="cri-o://f252c9621aeb6f57540396f406c8024dabc2c53548f02f5d62bd6cbde906b7ec" gracePeriod=30 Mar 17 11:33:47 crc kubenswrapper[4742]: I0317 11:33:47.408065 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerName="proxy-httpd" containerID="cri-o://489752cf5030bf150af46802e221606adff1081019dec2652f3caa0bc0eb3323" gracePeriod=30 Mar 17 11:33:47 crc kubenswrapper[4742]: I0317 11:33:47.408142 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerName="sg-core" containerID="cri-o://89624c1e11fa954ade4127695f9644194f90fd59f6d8b81c88167febb19ef04d" gracePeriod=30 Mar 17 11:33:47 crc kubenswrapper[4742]: I0317 11:33:47.408198 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerName="ceilometer-notification-agent" containerID="cri-o://a86eb43aa7d5bc9c0ad09a86de0916652eaf76218e4169f73f3380da347ed21e" gracePeriod=30 Mar 17 11:33:47 crc kubenswrapper[4742]: I0317 11:33:47.460245 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-c96b95bb7-ckpvc" Mar 17 11:33:47 crc kubenswrapper[4742]: I0317 11:33:47.462501 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-c96b95bb7-ckpvc" Mar 17 11:33:48 crc kubenswrapper[4742]: I0317 11:33:48.082172 4742 generic.go:334] "Generic (PLEG): container finished" 
podID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerID="489752cf5030bf150af46802e221606adff1081019dec2652f3caa0bc0eb3323" exitCode=0 Mar 17 11:33:48 crc kubenswrapper[4742]: I0317 11:33:48.082597 4742 generic.go:334] "Generic (PLEG): container finished" podID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerID="89624c1e11fa954ade4127695f9644194f90fd59f6d8b81c88167febb19ef04d" exitCode=2 Mar 17 11:33:48 crc kubenswrapper[4742]: I0317 11:33:48.082257 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a3f6205-139f-49cf-8262-1c17f5beb979","Type":"ContainerDied","Data":"489752cf5030bf150af46802e221606adff1081019dec2652f3caa0bc0eb3323"} Mar 17 11:33:48 crc kubenswrapper[4742]: I0317 11:33:48.082663 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a3f6205-139f-49cf-8262-1c17f5beb979","Type":"ContainerDied","Data":"89624c1e11fa954ade4127695f9644194f90fd59f6d8b81c88167febb19ef04d"} Mar 17 11:33:48 crc kubenswrapper[4742]: I0317 11:33:48.082681 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a3f6205-139f-49cf-8262-1c17f5beb979","Type":"ContainerDied","Data":"a86eb43aa7d5bc9c0ad09a86de0916652eaf76218e4169f73f3380da347ed21e"} Mar 17 11:33:48 crc kubenswrapper[4742]: I0317 11:33:48.082615 4742 generic.go:334] "Generic (PLEG): container finished" podID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerID="a86eb43aa7d5bc9c0ad09a86de0916652eaf76218e4169f73f3380da347ed21e" exitCode=0 Mar 17 11:33:48 crc kubenswrapper[4742]: I0317 11:33:48.082708 4742 generic.go:334] "Generic (PLEG): container finished" podID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerID="f252c9621aeb6f57540396f406c8024dabc2c53548f02f5d62bd6cbde906b7ec" exitCode=0 Mar 17 11:33:48 crc kubenswrapper[4742]: I0317 11:33:48.082815 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a3f6205-139f-49cf-8262-1c17f5beb979","Type":"ContainerDied","Data":"f252c9621aeb6f57540396f406c8024dabc2c53548f02f5d62bd6cbde906b7ec"} Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.714847 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.805584 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-scripts\") pod \"0a3f6205-139f-49cf-8262-1c17f5beb979\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.805843 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5924\" (UniqueName: \"kubernetes.io/projected/0a3f6205-139f-49cf-8262-1c17f5beb979-kube-api-access-b5924\") pod \"0a3f6205-139f-49cf-8262-1c17f5beb979\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.805865 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-config-data\") pod \"0a3f6205-139f-49cf-8262-1c17f5beb979\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.805898 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-sg-core-conf-yaml\") pod \"0a3f6205-139f-49cf-8262-1c17f5beb979\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.805941 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a3f6205-139f-49cf-8262-1c17f5beb979-run-httpd\") pod \"0a3f6205-139f-49cf-8262-1c17f5beb979\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.806019 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a3f6205-139f-49cf-8262-1c17f5beb979-log-httpd\") pod \"0a3f6205-139f-49cf-8262-1c17f5beb979\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.806047 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-combined-ca-bundle\") pod \"0a3f6205-139f-49cf-8262-1c17f5beb979\" (UID: \"0a3f6205-139f-49cf-8262-1c17f5beb979\") " Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.806375 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a3f6205-139f-49cf-8262-1c17f5beb979-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0a3f6205-139f-49cf-8262-1c17f5beb979" (UID: "0a3f6205-139f-49cf-8262-1c17f5beb979"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.806482 4742 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a3f6205-139f-49cf-8262-1c17f5beb979-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.806521 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a3f6205-139f-49cf-8262-1c17f5beb979-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0a3f6205-139f-49cf-8262-1c17f5beb979" (UID: "0a3f6205-139f-49cf-8262-1c17f5beb979"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.811172 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-scripts" (OuterVolumeSpecName: "scripts") pod "0a3f6205-139f-49cf-8262-1c17f5beb979" (UID: "0a3f6205-139f-49cf-8262-1c17f5beb979"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.812070 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a3f6205-139f-49cf-8262-1c17f5beb979-kube-api-access-b5924" (OuterVolumeSpecName: "kube-api-access-b5924") pod "0a3f6205-139f-49cf-8262-1c17f5beb979" (UID: "0a3f6205-139f-49cf-8262-1c17f5beb979"). InnerVolumeSpecName "kube-api-access-b5924". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.834581 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0a3f6205-139f-49cf-8262-1c17f5beb979" (UID: "0a3f6205-139f-49cf-8262-1c17f5beb979"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.883190 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a3f6205-139f-49cf-8262-1c17f5beb979" (UID: "0a3f6205-139f-49cf-8262-1c17f5beb979"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.910533 4742 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a3f6205-139f-49cf-8262-1c17f5beb979-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.910607 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.910636 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.910661 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5924\" (UniqueName: \"kubernetes.io/projected/0a3f6205-139f-49cf-8262-1c17f5beb979-kube-api-access-b5924\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.910688 4742 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.934721 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bq6vr"] Mar 17 11:33:50 crc kubenswrapper[4742]: I0317 11:33:50.953710 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-config-data" (OuterVolumeSpecName: "config-data") pod "0a3f6205-139f-49cf-8262-1c17f5beb979" (UID: "0a3f6205-139f-49cf-8262-1c17f5beb979"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.011978 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a3f6205-139f-49cf-8262-1c17f5beb979-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.108939 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"11e12da8-9e80-453f-bbbd-03d1346afe5b","Type":"ContainerStarted","Data":"436c798787ec0a4e95f916a541c8806d5c4697c3e383d52d43bcdd0048270078"} Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.112126 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a3f6205-139f-49cf-8262-1c17f5beb979","Type":"ContainerDied","Data":"1a0212f85e806ff2cd888dcad3f60f44d856ab92e04244c98df17cb0abc0e84b"} Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.112177 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.112186 4742 scope.go:117] "RemoveContainer" containerID="489752cf5030bf150af46802e221606adff1081019dec2652f3caa0bc0eb3323" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.116177 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bq6vr" event={"ID":"e15fe5ee-73d7-415a-a61c-a0e67d085f3a","Type":"ContainerStarted","Data":"3175413b26ce181e380f91de13370c43edda132ccc71b62920fb43e84d575e12"} Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.121835 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.107211362 podStartE2EDuration="13.121814284s" podCreationTimestamp="2026-03-17 11:33:38 +0000 UTC" firstStartedPulling="2026-03-17 11:33:39.400230159 +0000 UTC m=+1322.526357917" lastFinishedPulling="2026-03-17 11:33:50.414833071 +0000 UTC m=+1333.540960839" observedRunningTime="2026-03-17 11:33:51.120927869 +0000 UTC m=+1334.247055627" watchObservedRunningTime="2026-03-17 11:33:51.121814284 +0000 UTC m=+1334.247942052" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.141361 4742 scope.go:117] "RemoveContainer" containerID="89624c1e11fa954ade4127695f9644194f90fd59f6d8b81c88167febb19ef04d" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.166105 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.180033 4742 scope.go:117] "RemoveContainer" containerID="a86eb43aa7d5bc9c0ad09a86de0916652eaf76218e4169f73f3380da347ed21e" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.180609 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.192087 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:33:51 crc kubenswrapper[4742]: E0317 11:33:51.192461 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerName="ceilometer-central-agent" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.192473 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerName="ceilometer-central-agent" Mar 17 11:33:51 crc kubenswrapper[4742]: E0317 11:33:51.192487 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerName="proxy-httpd" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.192493 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerName="proxy-httpd" Mar 17 11:33:51 crc kubenswrapper[4742]: E0317 11:33:51.192511 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerName="ceilometer-notification-agent" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.192517 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerName="ceilometer-notification-agent" Mar 17 11:33:51 crc kubenswrapper[4742]: E0317 11:33:51.192528 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerName="sg-core" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.192534 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3f6205-139f-49cf-8262-1c17f5beb979" 
containerName="sg-core" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.192675 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerName="proxy-httpd" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.192686 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerName="sg-core" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.192700 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerName="ceilometer-central-agent" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.192709 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a3f6205-139f-49cf-8262-1c17f5beb979" containerName="ceilometer-notification-agent" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.194324 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.198530 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.198700 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.203756 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.207663 4742 scope.go:117] "RemoveContainer" containerID="f252c9621aeb6f57540396f406c8024dabc2c53548f02f5d62bd6cbde906b7ec" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.317076 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.317134 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-log-httpd\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.317168 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-run-httpd\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.317205 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6lxc\" (UniqueName: \"kubernetes.io/projected/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-kube-api-access-q6lxc\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.317247 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-scripts\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc 
kubenswrapper[4742]: I0317 11:33:51.317270 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.317319 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-config-data\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.418612 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.418994 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-log-httpd\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.419037 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-run-httpd\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.419069 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6lxc\" (UniqueName: \"kubernetes.io/projected/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-kube-api-access-q6lxc\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.419117 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-scripts\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.419135 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.419197 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-config-data\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.419751 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-run-httpd\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 
11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.420160 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-log-httpd\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.423754 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-scripts\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.424099 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.424186 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.425127 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-config-data\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.446349 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6lxc\" (UniqueName: \"kubernetes.io/projected/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-kube-api-access-q6lxc\") pod \"ceilometer-0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " pod="openstack/ceilometer-0" Mar 17 11:33:51 crc kubenswrapper[4742]: I0317 11:33:51.514122 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:33:52 crc kubenswrapper[4742]: I0317 11:33:52.009441 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:33:52 crc kubenswrapper[4742]: I0317 11:33:52.024102 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-c7d48c699-86xxh" Mar 17 11:33:52 crc kubenswrapper[4742]: I0317 11:33:52.095164 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9d44b9d7b-r5znz"] Mar 17 11:33:52 crc kubenswrapper[4742]: I0317 11:33:52.095500 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-9d44b9d7b-r5znz" podUID="69896e76-e60c-4941-b013-a702791923ec" containerName="neutron-api" containerID="cri-o://831de658b40e525dec4242bdf49d13b3b0065da3ec5d4638a8e9dc7531098e33" gracePeriod=30 Mar 17 11:33:52 crc kubenswrapper[4742]: I0317 11:33:52.096124 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-9d44b9d7b-r5znz" podUID="69896e76-e60c-4941-b013-a702791923ec" containerName="neutron-httpd" containerID="cri-o://20d2c94ecc394709e98a453bbe007a230f2c3d20068fb621f4384bce882f1336" gracePeriod=30 Mar 17 11:33:52 crc kubenswrapper[4742]: I0317 11:33:52.117401 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5cbc75d594-mvhf5" podUID="f2bbef92-cd02-42d8-b81d-ab7248e29328" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Mar 17 11:33:52 crc kubenswrapper[4742]: I0317 11:33:52.117499 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:33:52 crc kubenswrapper[4742]: I0317 11:33:52.131133 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25974d2f-0187-46fd-8d7a-4ac3c5b555d0","Type":"ContainerStarted","Data":"2c83fad22fa63229c300407650b27b1a64e7842bfefe6fb3fb3628d75b048074"} Mar 17 11:33:52 crc kubenswrapper[4742]: I0317 11:33:52.685708 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a3f6205-139f-49cf-8262-1c17f5beb979" path="/var/lib/kubelet/pods/0a3f6205-139f-49cf-8262-1c17f5beb979/volumes" Mar 17 11:33:53 crc kubenswrapper[4742]: I0317 11:33:53.150783 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25974d2f-0187-46fd-8d7a-4ac3c5b555d0","Type":"ContainerStarted","Data":"c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57"} Mar 17 11:33:53 crc kubenswrapper[4742]: I0317 11:33:53.153930 4742 generic.go:334] "Generic (PLEG): container finished" podID="69896e76-e60c-4941-b013-a702791923ec" containerID="20d2c94ecc394709e98a453bbe007a230f2c3d20068fb621f4384bce882f1336" exitCode=0 Mar 17 11:33:53 crc kubenswrapper[4742]: I0317 11:33:53.153955 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9d44b9d7b-r5znz" event={"ID":"69896e76-e60c-4941-b013-a702791923ec","Type":"ContainerDied","Data":"20d2c94ecc394709e98a453bbe007a230f2c3d20068fb621f4384bce882f1336"} Mar 17 11:33:54 crc kubenswrapper[4742]: I0317 11:33:54.165554 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25974d2f-0187-46fd-8d7a-4ac3c5b555d0","Type":"ContainerStarted","Data":"dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26"} Mar 17 11:33:55 crc 
kubenswrapper[4742]: I0317 11:33:55.186283 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25974d2f-0187-46fd-8d7a-4ac3c5b555d0","Type":"ContainerStarted","Data":"12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1"} Mar 17 11:33:55 crc kubenswrapper[4742]: I0317 11:33:55.191670 4742 generic.go:334] "Generic (PLEG): container finished" podID="69896e76-e60c-4941-b013-a702791923ec" containerID="831de658b40e525dec4242bdf49d13b3b0065da3ec5d4638a8e9dc7531098e33" exitCode=0 Mar 17 11:33:55 crc kubenswrapper[4742]: I0317 11:33:55.191705 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9d44b9d7b-r5znz" event={"ID":"69896e76-e60c-4941-b013-a702791923ec","Type":"ContainerDied","Data":"831de658b40e525dec4242bdf49d13b3b0065da3ec5d4638a8e9dc7531098e33"} Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.875399 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod536f4bea_32b3_4fd4_a576_b73a67d7ad23.slice/crio-4ff039c5f059f386c06e8e1cfdbffd1e2a65257ed35a387c1224c90134df023a.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod536f4bea_32b3_4fd4_a576_b73a67d7ad23.slice/crio-4ff039c5f059f386c06e8e1cfdbffd1e2a65257ed35a387c1224c90134df023a.scope: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.875828 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-1a0212f85e806ff2cd888dcad3f60f44d856ab92e04244c98df17cb0abc0e84b": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-1a0212f85e806ff2cd888dcad3f60f44d856ab92e04244c98df17cb0abc0e84b: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.875878 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fce8ab7_2019_4d76_a2b5_003b5489bd87.slice/crio-3fb2431ee69d6dceb983a2ee00cd849cbc3784bdb37c6e7a29d14a7a67072138": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fce8ab7_2019_4d76_a2b5_003b5489bd87.slice/crio-3fb2431ee69d6dceb983a2ee00cd849cbc3784bdb37c6e7a29d14a7a67072138: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.875956 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod191435c1_53cc_4df8_97de_1c71c78d9595.slice/crio-31fe40cf3aa0f04753e3d181d27128ac021c26086232a229bf97e8ec4934b525": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod191435c1_53cc_4df8_97de_1c71c78d9595.slice/crio-31fe40cf3aa0f04753e3d181d27128ac021c26086232a229bf97e8ec4934b525: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.875987 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5e4c0a2_6132_410b_8740_6c9e171c7824.slice/crio-00cc78dbf36dd1c9e9cf86444577371b2a7ad5f195b72852695fecae455923a3": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5e4c0a2_6132_410b_8740_6c9e171c7824.slice/crio-00cc78dbf36dd1c9e9cf86444577371b2a7ad5f195b72852695fecae455923a3: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.876001 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fce8ab7_2019_4d76_a2b5_003b5489bd87.slice/crio-conmon-88168536c116dbbdd920a498b31d10d9503492eb277131566703e79519b7835c.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fce8ab7_2019_4d76_a2b5_003b5489bd87.slice/crio-conmon-88168536c116dbbdd920a498b31d10d9503492eb277131566703e79519b7835c.scope: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.876018 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod191435c1_53cc_4df8_97de_1c71c78d9595.slice/crio-conmon-09bfd82acac1b94c1f73f7ca0c85c42677c88cfd225aac247dc1317905cfc092.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod191435c1_53cc_4df8_97de_1c71c78d9595.slice/crio-conmon-09bfd82acac1b94c1f73f7ca0c85c42677c88cfd225aac247dc1317905cfc092.scope: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.876032 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod191435c1_53cc_4df8_97de_1c71c78d9595.slice/crio-09bfd82acac1b94c1f73f7ca0c85c42677c88cfd225aac247dc1317905cfc092.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod191435c1_53cc_4df8_97de_1c71c78d9595.slice/crio-09bfd82acac1b94c1f73f7ca0c85c42677c88cfd225aac247dc1317905cfc092.scope: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.876070 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fce8ab7_2019_4d76_a2b5_003b5489bd87.slice/crio-88168536c116dbbdd920a498b31d10d9503492eb277131566703e79519b7835c.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fce8ab7_2019_4d76_a2b5_003b5489bd87.slice/crio-88168536c116dbbdd920a498b31d10d9503492eb277131566703e79519b7835c.scope: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.876210 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5e4c0a2_6132_410b_8740_6c9e171c7824.slice/crio-conmon-ed201381f67965df77d8cbea154884269fa05a627a0bd50f69090277b724b3ed.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5e4c0a2_6132_410b_8740_6c9e171c7824.slice/crio-conmon-ed201381f67965df77d8cbea154884269fa05a627a0bd50f69090277b724b3ed.scope: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.876239 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5e4c0a2_6132_410b_8740_6c9e171c7824.slice/crio-ed201381f67965df77d8cbea154884269fa05a627a0bd50f69090277b724b3ed.scope": 0x40000100 == 
IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5e4c0a2_6132_410b_8740_6c9e171c7824.slice/crio-ed201381f67965df77d8cbea154884269fa05a627a0bd50f69090277b724b3ed.scope: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.876418 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-conmon-f252c9621aeb6f57540396f406c8024dabc2c53548f02f5d62bd6cbde906b7ec.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-conmon-f252c9621aeb6f57540396f406c8024dabc2c53548f02f5d62bd6cbde906b7ec.scope: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.876452 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-f252c9621aeb6f57540396f406c8024dabc2c53548f02f5d62bd6cbde906b7ec.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-f252c9621aeb6f57540396f406c8024dabc2c53548f02f5d62bd6cbde906b7ec.scope: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.876514 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-conmon-a86eb43aa7d5bc9c0ad09a86de0916652eaf76218e4169f73f3380da347ed21e.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-conmon-a86eb43aa7d5bc9c0ad09a86de0916652eaf76218e4169f73f3380da347ed21e.scope: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.876565 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-a86eb43aa7d5bc9c0ad09a86de0916652eaf76218e4169f73f3380da347ed21e.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-a86eb43aa7d5bc9c0ad09a86de0916652eaf76218e4169f73f3380da347ed21e.scope: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.876650 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-conmon-89624c1e11fa954ade4127695f9644194f90fd59f6d8b81c88167febb19ef04d.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-conmon-89624c1e11fa954ade4127695f9644194f90fd59f6d8b81c88167febb19ef04d.scope: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.876693 4742 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-89624c1e11fa954ade4127695f9644194f90fd59f6d8b81c88167febb19ef04d.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-89624c1e11fa954ade4127695f9644194f90fd59f6d8b81c88167febb19ef04d.scope: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.882840 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-conmon-489752cf5030bf150af46802e221606adff1081019dec2652f3caa0bc0eb3323.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-conmon-489752cf5030bf150af46802e221606adff1081019dec2652f3caa0bc0eb3323.scope: no such file or directory Mar 17 11:33:56 crc kubenswrapper[4742]: W0317 11:33:56.882878 4742 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-489752cf5030bf150af46802e221606adff1081019dec2652f3caa0bc0eb3323.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a3f6205_139f_49cf_8262_1c17f5beb979.slice/crio-489752cf5030bf150af46802e221606adff1081019dec2652f3caa0bc0eb3323.scope: no such file or directory Mar 17 11:33:57 crc kubenswrapper[4742]: I0317 11:33:57.207834 4742 generic.go:334] "Generic (PLEG): container finished" podID="f2bbef92-cd02-42d8-b81d-ab7248e29328" containerID="ae60630080e9cc570d98471ac0c31f7d1dbbc55f4cd0bf7020ef89cc50cc5e24" exitCode=137 Mar 17 11:33:57 crc kubenswrapper[4742]: I0317 11:33:57.207872 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cbc75d594-mvhf5" event={"ID":"f2bbef92-cd02-42d8-b81d-ab7248e29328","Type":"ContainerDied","Data":"ae60630080e9cc570d98471ac0c31f7d1dbbc55f4cd0bf7020ef89cc50cc5e24"} Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.037917 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.038592 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5d937ab3-6dfb-4b0c-a846-da4820bad05f" containerName="glance-log" containerID="cri-o://b16a1e0dea5c35e5a713c52318aa554fbd5a170e6b1beda799e52b7f33cd1c7b" gracePeriod=30 Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.038995 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5d937ab3-6dfb-4b0c-a846-da4820bad05f" containerName="glance-httpd" containerID="cri-o://471b1b0b9bc05fd61003bdefb4957ca8755e2dd3462d5fa147269aa7e6143ecf" gracePeriod=30 Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.122355 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.126943 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6976ff4586-bgqjp" Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.150400 
4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562454-s7z5z"] Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.151668 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562454-s7z5z" Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.154363 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.154400 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.154510 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.190789 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562454-s7z5z"] Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.250310 4742 generic.go:334] "Generic (PLEG): container finished" podID="5d937ab3-6dfb-4b0c-a846-da4820bad05f" containerID="b16a1e0dea5c35e5a713c52318aa554fbd5a170e6b1beda799e52b7f33cd1c7b" exitCode=143 Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.251567 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5d937ab3-6dfb-4b0c-a846-da4820bad05f","Type":"ContainerDied","Data":"b16a1e0dea5c35e5a713c52318aa554fbd5a170e6b1beda799e52b7f33cd1c7b"} Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.289361 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drk7f\" (UniqueName: \"kubernetes.io/projected/bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df-kube-api-access-drk7f\") pod \"auto-csr-approver-29562454-s7z5z\" (UID: \"bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df\") " pod="openshift-infra/auto-csr-approver-29562454-s7z5z" Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.289848 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5485d7d4fb-62qtm"] Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.290144 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5485d7d4fb-62qtm" podUID="b078827f-c462-4bbb-8d77-06a978218545" containerName="placement-log" containerID="cri-o://ce8b09bc00016b52c043dd68c02387a5a892be022880d1ac0dc4c2353c9a165b" gracePeriod=30 Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.290664 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5485d7d4fb-62qtm" podUID="b078827f-c462-4bbb-8d77-06a978218545" containerName="placement-api" containerID="cri-o://3325ce256720c2da33849b80aa0da173fb655eaaba82d00416cacdf1b1b8e0f6" gracePeriod=30 Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.391381 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drk7f\" (UniqueName: \"kubernetes.io/projected/bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df-kube-api-access-drk7f\") pod \"auto-csr-approver-29562454-s7z5z\" (UID: \"bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df\") " pod="openshift-infra/auto-csr-approver-29562454-s7z5z" Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.418827 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drk7f\" (UniqueName: 
\"kubernetes.io/projected/bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df-kube-api-access-drk7f\") pod \"auto-csr-approver-29562454-s7z5z\" (UID: \"bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df\") " pod="openshift-infra/auto-csr-approver-29562454-s7z5z" Mar 17 11:34:00 crc kubenswrapper[4742]: I0317 11:34:00.491447 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562454-s7z5z" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.002334 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.015970 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-httpd-config\") pod \"69896e76-e60c-4941-b013-a702791923ec\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.017083 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt2z9\" (UniqueName: \"kubernetes.io/projected/69896e76-e60c-4941-b013-a702791923ec-kube-api-access-vt2z9\") pod \"69896e76-e60c-4941-b013-a702791923ec\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.017199 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-ovndb-tls-certs\") pod \"69896e76-e60c-4941-b013-a702791923ec\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.017228 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-combined-ca-bundle\") pod \"69896e76-e60c-4941-b013-a702791923ec\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.017254 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-config\") pod \"69896e76-e60c-4941-b013-a702791923ec\" (UID: \"69896e76-e60c-4941-b013-a702791923ec\") " Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.028428 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69896e76-e60c-4941-b013-a702791923ec-kube-api-access-vt2z9" (OuterVolumeSpecName: "kube-api-access-vt2z9") pod "69896e76-e60c-4941-b013-a702791923ec" (UID: "69896e76-e60c-4941-b013-a702791923ec"). InnerVolumeSpecName "kube-api-access-vt2z9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.028640 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "69896e76-e60c-4941-b013-a702791923ec" (UID: "69896e76-e60c-4941-b013-a702791923ec"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.124059 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt2z9\" (UniqueName: \"kubernetes.io/projected/69896e76-e60c-4941-b013-a702791923ec-kube-api-access-vt2z9\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.124090 4742 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.127357 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-config" (OuterVolumeSpecName: "config") pod "69896e76-e60c-4941-b013-a702791923ec" (UID: "69896e76-e60c-4941-b013-a702791923ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.175111 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69896e76-e60c-4941-b013-a702791923ec" (UID: "69896e76-e60c-4941-b013-a702791923ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.176896 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.199692 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "69896e76-e60c-4941-b013-a702791923ec" (UID: "69896e76-e60c-4941-b013-a702791923ec"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.237197 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.237670 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-combined-ca-bundle\") pod \"f2bbef92-cd02-42d8-b81d-ab7248e29328\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.237730 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m62rh\" (UniqueName: \"kubernetes.io/projected/f2bbef92-cd02-42d8-b81d-ab7248e29328-kube-api-access-m62rh\") pod \"f2bbef92-cd02-42d8-b81d-ab7248e29328\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.237823 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-horizon-tls-certs\") pod \"f2bbef92-cd02-42d8-b81d-ab7248e29328\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.237929 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-horizon-secret-key\") pod \"f2bbef92-cd02-42d8-b81d-ab7248e29328\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.237978 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2bbef92-cd02-42d8-b81d-ab7248e29328-scripts\") pod \"f2bbef92-cd02-42d8-b81d-ab7248e29328\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.238015 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2bbef92-cd02-42d8-b81d-ab7248e29328-logs\") pod \"f2bbef92-cd02-42d8-b81d-ab7248e29328\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.238135 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2bbef92-cd02-42d8-b81d-ab7248e29328-config-data\") pod \"f2bbef92-cd02-42d8-b81d-ab7248e29328\" (UID: \"f2bbef92-cd02-42d8-b81d-ab7248e29328\") " Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.238640 4742 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.238667 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.238679 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/69896e76-e60c-4941-b013-a702791923ec-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.248413 4742 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2bbef92-cd02-42d8-b81d-ab7248e29328-logs" (OuterVolumeSpecName: "logs") pod "f2bbef92-cd02-42d8-b81d-ab7248e29328" (UID: "f2bbef92-cd02-42d8-b81d-ab7248e29328"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.249510 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f2bbef92-cd02-42d8-b81d-ab7248e29328" (UID: "f2bbef92-cd02-42d8-b81d-ab7248e29328"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.250799 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2bbef92-cd02-42d8-b81d-ab7248e29328-kube-api-access-m62rh" (OuterVolumeSpecName: "kube-api-access-m62rh") pod "f2bbef92-cd02-42d8-b81d-ab7248e29328" (UID: "f2bbef92-cd02-42d8-b81d-ab7248e29328"). InnerVolumeSpecName "kube-api-access-m62rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.266554 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2bbef92-cd02-42d8-b81d-ab7248e29328-config-data" (OuterVolumeSpecName: "config-data") pod "f2bbef92-cd02-42d8-b81d-ab7248e29328" (UID: "f2bbef92-cd02-42d8-b81d-ab7248e29328"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.267497 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2bbef92-cd02-42d8-b81d-ab7248e29328-scripts" (OuterVolumeSpecName: "scripts") pod "f2bbef92-cd02-42d8-b81d-ab7248e29328" (UID: "f2bbef92-cd02-42d8-b81d-ab7248e29328"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.268704 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bq6vr" event={"ID":"e15fe5ee-73d7-415a-a61c-a0e67d085f3a","Type":"ContainerStarted","Data":"49c102ea2f25979bc529898327d689137acdb1c3e0ef759e8b4da71e4736f9aa"} Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.277768 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cbc75d594-mvhf5" event={"ID":"f2bbef92-cd02-42d8-b81d-ab7248e29328","Type":"ContainerDied","Data":"f7ff99f1880036e2d5976ac2a62e3410d8bb4ed86b7d89695323874b86be177c"} Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.277808 4742 scope.go:117] "RemoveContainer" containerID="bdba66d58b80df369c99a5b4d40c5a7e87b7ad31620cffac6b75800835ccef63" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.277960 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cbc75d594-mvhf5" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.284428 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2bbef92-cd02-42d8-b81d-ab7248e29328" (UID: "f2bbef92-cd02-42d8-b81d-ab7248e29328"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.289258 4742 generic.go:334] "Generic (PLEG): container finished" podID="b078827f-c462-4bbb-8d77-06a978218545" containerID="ce8b09bc00016b52c043dd68c02387a5a892be022880d1ac0dc4c2353c9a165b" exitCode=143 Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.289312 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5485d7d4fb-62qtm" event={"ID":"b078827f-c462-4bbb-8d77-06a978218545","Type":"ContainerDied","Data":"ce8b09bc00016b52c043dd68c02387a5a892be022880d1ac0dc4c2353c9a165b"} Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.296171 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-bq6vr" podStartSLOduration=7.303192156 podStartE2EDuration="17.296153283s" podCreationTimestamp="2026-03-17 11:33:44 +0000 UTC" firstStartedPulling="2026-03-17 11:33:50.939384462 +0000 UTC m=+1334.065512220" lastFinishedPulling="2026-03-17 11:34:00.932345589 +0000 UTC m=+1344.058473347" observedRunningTime="2026-03-17 11:34:01.288833819 +0000 UTC m=+1344.414961577" watchObservedRunningTime="2026-03-17 11:34:01.296153283 +0000 UTC m=+1344.422281041" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.305801 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25974d2f-0187-46fd-8d7a-4ac3c5b555d0","Type":"ContainerStarted","Data":"aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18"} Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.307688 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.311364 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "f2bbef92-cd02-42d8-b81d-ab7248e29328" (UID: "f2bbef92-cd02-42d8-b81d-ab7248e29328"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.315411 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9d44b9d7b-r5znz" event={"ID":"69896e76-e60c-4941-b013-a702791923ec","Type":"ContainerDied","Data":"4fc0b50a962ed4dea4c4d507eee8424247b69b59a9f391dc5f262e36cfcffcf0"} Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.315482 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9d44b9d7b-r5znz" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.333018 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.475981132 podStartE2EDuration="10.332994788s" podCreationTimestamp="2026-03-17 11:33:51 +0000 UTC" firstStartedPulling="2026-03-17 11:33:52.016771892 +0000 UTC m=+1335.142899660" lastFinishedPulling="2026-03-17 11:34:00.873785568 +0000 UTC m=+1343.999913316" observedRunningTime="2026-03-17 11:34:01.331262911 +0000 UTC m=+1344.457390669" watchObservedRunningTime="2026-03-17 11:34:01.332994788 +0000 UTC m=+1344.459122546" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.339838 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.339865 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m62rh\" (UniqueName: \"kubernetes.io/projected/f2bbef92-cd02-42d8-b81d-ab7248e29328-kube-api-access-m62rh\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.339876 4742 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.339885 4742 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f2bbef92-cd02-42d8-b81d-ab7248e29328-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.339894 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2bbef92-cd02-42d8-b81d-ab7248e29328-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.339920 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2bbef92-cd02-42d8-b81d-ab7248e29328-logs\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.339935 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f2bbef92-cd02-42d8-b81d-ab7248e29328-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.365350 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9d44b9d7b-r5znz"] Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.375670 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9d44b9d7b-r5znz"] Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.411117 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562454-s7z5z"] Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.455597 4742 scope.go:117] "RemoveContainer" containerID="ae60630080e9cc570d98471ac0c31f7d1dbbc55f4cd0bf7020ef89cc50cc5e24" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.549941 4742 scope.go:117] "RemoveContainer" containerID="20d2c94ecc394709e98a453bbe007a230f2c3d20068fb621f4384bce882f1336" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.571113 4742 scope.go:117] "RemoveContainer" 
containerID="831de658b40e525dec4242bdf49d13b3b0065da3ec5d4638a8e9dc7531098e33" Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.626442 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cbc75d594-mvhf5"] Mar 17 11:34:01 crc kubenswrapper[4742]: I0317 11:34:01.656529 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5cbc75d594-mvhf5"] Mar 17 11:34:02 crc kubenswrapper[4742]: I0317 11:34:02.132008 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 11:34:02 crc kubenswrapper[4742]: I0317 11:34:02.133456 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="24b880a5-c4dc-4566-80c3-13fddf078932" containerName="glance-httpd" containerID="cri-o://8f9fd3cf05b1b274b0195ccc59bfc69550a213322d5dfbdf127ca7bc53d87c06" gracePeriod=30 Mar 17 11:34:02 crc kubenswrapper[4742]: I0317 11:34:02.133647 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="24b880a5-c4dc-4566-80c3-13fddf078932" containerName="glance-log" containerID="cri-o://21dfba13d6ea06b5e5b7ca6e464fcedbd33bca247f133278945a35a27374bda5" gracePeriod=30 Mar 17 11:34:02 crc kubenswrapper[4742]: I0317 11:34:02.328421 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562454-s7z5z" event={"ID":"bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df","Type":"ContainerStarted","Data":"127ef21f250feed9ca6b7f3a401e2e002e77aaa98c3e7c6d311fc59c4b0c0789"} Mar 17 11:34:02 crc kubenswrapper[4742]: I0317 11:34:02.331536 4742 generic.go:334] "Generic (PLEG): container finished" podID="24b880a5-c4dc-4566-80c3-13fddf078932" containerID="21dfba13d6ea06b5e5b7ca6e464fcedbd33bca247f133278945a35a27374bda5" exitCode=143 Mar 17 11:34:02 crc kubenswrapper[4742]: I0317 11:34:02.331615 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24b880a5-c4dc-4566-80c3-13fddf078932","Type":"ContainerDied","Data":"21dfba13d6ea06b5e5b7ca6e464fcedbd33bca247f133278945a35a27374bda5"} Mar 17 11:34:02 crc kubenswrapper[4742]: I0317 11:34:02.332460 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerName="ceilometer-central-agent" containerID="cri-o://c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57" gracePeriod=30 Mar 17 11:34:02 crc kubenswrapper[4742]: I0317 11:34:02.332789 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerName="proxy-httpd" containerID="cri-o://aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18" gracePeriod=30 Mar 17 11:34:02 crc kubenswrapper[4742]: I0317 11:34:02.332857 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerName="ceilometer-notification-agent" containerID="cri-o://dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26" gracePeriod=30 Mar 17 11:34:02 crc kubenswrapper[4742]: I0317 11:34:02.332977 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerName="sg-core" 
containerID="cri-o://12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1" gracePeriod=30 Mar 17 11:34:02 crc kubenswrapper[4742]: I0317 11:34:02.675927 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69896e76-e60c-4941-b013-a702791923ec" path="/var/lib/kubelet/pods/69896e76-e60c-4941-b013-a702791923ec/volumes" Mar 17 11:34:02 crc kubenswrapper[4742]: I0317 11:34:02.676545 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2bbef92-cd02-42d8-b81d-ab7248e29328" path="/var/lib/kubelet/pods/f2bbef92-cd02-42d8-b81d-ab7248e29328/volumes" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.305158 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.373716 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-sg-core-conf-yaml\") pod \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.374010 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-config-data\") pod \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.374035 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-log-httpd\") pod \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.374062 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6lxc\" (UniqueName: \"kubernetes.io/projected/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-kube-api-access-q6lxc\") pod \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.374099 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-run-httpd\") pod \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.374116 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-combined-ca-bundle\") pod \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.374158 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-scripts\") pod \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\" (UID: \"25974d2f-0187-46fd-8d7a-4ac3c5b555d0\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.376332 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "25974d2f-0187-46fd-8d7a-4ac3c5b555d0" (UID: 
"25974d2f-0187-46fd-8d7a-4ac3c5b555d0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.377202 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "25974d2f-0187-46fd-8d7a-4ac3c5b555d0" (UID: "25974d2f-0187-46fd-8d7a-4ac3c5b555d0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.379494 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562454-s7z5z" event={"ID":"bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df","Type":"ContainerStarted","Data":"9a27c20a440ff32b968b604ff8353be71ac1dc698779ef00ba40c41a91b652a5"} Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.388942 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-kube-api-access-q6lxc" (OuterVolumeSpecName: "kube-api-access-q6lxc") pod "25974d2f-0187-46fd-8d7a-4ac3c5b555d0" (UID: "25974d2f-0187-46fd-8d7a-4ac3c5b555d0"). InnerVolumeSpecName "kube-api-access-q6lxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.399607 4742 generic.go:334] "Generic (PLEG): container finished" podID="5d937ab3-6dfb-4b0c-a846-da4820bad05f" containerID="471b1b0b9bc05fd61003bdefb4957ca8755e2dd3462d5fa147269aa7e6143ecf" exitCode=0 Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.399702 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5d937ab3-6dfb-4b0c-a846-da4820bad05f","Type":"ContainerDied","Data":"471b1b0b9bc05fd61003bdefb4957ca8755e2dd3462d5fa147269aa7e6143ecf"} Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.415784 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562454-s7z5z" podStartSLOduration=2.278551727 podStartE2EDuration="3.415767493s" podCreationTimestamp="2026-03-17 11:34:00 +0000 UTC" firstStartedPulling="2026-03-17 11:34:01.482609546 +0000 UTC m=+1344.608737304" lastFinishedPulling="2026-03-17 11:34:02.619825312 +0000 UTC m=+1345.745953070" observedRunningTime="2026-03-17 11:34:03.401486485 +0000 UTC m=+1346.527614243" watchObservedRunningTime="2026-03-17 11:34:03.415767493 +0000 UTC m=+1346.541895251" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.421118 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.421960 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25974d2f-0187-46fd-8d7a-4ac3c5b555d0","Type":"ContainerDied","Data":"aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18"} Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.422032 4742 scope.go:117] "RemoveContainer" containerID="aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.424561 4742 generic.go:334] "Generic (PLEG): container finished" podID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerID="aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18" exitCode=0 Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.424650 4742 generic.go:334] "Generic (PLEG): container finished" podID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerID="12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1" exitCode=2 Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.424662 4742 generic.go:334] "Generic (PLEG): container finished" podID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerID="dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26" exitCode=0 Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.424670 4742 generic.go:334] "Generic (PLEG): container finished" podID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerID="c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57" exitCode=0 Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.424696 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25974d2f-0187-46fd-8d7a-4ac3c5b555d0","Type":"ContainerDied","Data":"12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1"} Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.424741 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25974d2f-0187-46fd-8d7a-4ac3c5b555d0","Type":"ContainerDied","Data":"dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26"} Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.425099 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25974d2f-0187-46fd-8d7a-4ac3c5b555d0","Type":"ContainerDied","Data":"c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57"} Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.425111 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"25974d2f-0187-46fd-8d7a-4ac3c5b555d0","Type":"ContainerDied","Data":"2c83fad22fa63229c300407650b27b1a64e7842bfefe6fb3fb3628d75b048074"} Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.426078 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-scripts" (OuterVolumeSpecName: "scripts") pod "25974d2f-0187-46fd-8d7a-4ac3c5b555d0" (UID: "25974d2f-0187-46fd-8d7a-4ac3c5b555d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.446660 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "25974d2f-0187-46fd-8d7a-4ac3c5b555d0" (UID: "25974d2f-0187-46fd-8d7a-4ac3c5b555d0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.476454 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6lxc\" (UniqueName: \"kubernetes.io/projected/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-kube-api-access-q6lxc\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.476488 4742 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.476503 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.476517 4742 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.476529 4742 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.557359 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25974d2f-0187-46fd-8d7a-4ac3c5b555d0" (UID: "25974d2f-0187-46fd-8d7a-4ac3c5b555d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.576141 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-config-data" (OuterVolumeSpecName: "config-data") pod "25974d2f-0187-46fd-8d7a-4ac3c5b555d0" (UID: "25974d2f-0187-46fd-8d7a-4ac3c5b555d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.577926 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.577956 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25974d2f-0187-46fd-8d7a-4ac3c5b555d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.805282 4742 scope.go:117] "RemoveContainer" containerID="12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.810953 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.832311 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.843843 4742 scope.go:117] "RemoveContainer" containerID="dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.852329 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.860317 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5485d7d4fb-62qtm" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.863982 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:34:03 crc kubenswrapper[4742]: E0317 11:34:03.864531 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2bbef92-cd02-42d8-b81d-ab7248e29328" containerName="horizon" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.864554 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2bbef92-cd02-42d8-b81d-ab7248e29328" containerName="horizon" Mar 17 11:34:03 crc kubenswrapper[4742]: E0317 11:34:03.864574 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2bbef92-cd02-42d8-b81d-ab7248e29328" containerName="horizon-log" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.864583 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2bbef92-cd02-42d8-b81d-ab7248e29328" containerName="horizon-log" Mar 17 11:34:03 crc kubenswrapper[4742]: E0317 11:34:03.864593 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69896e76-e60c-4941-b013-a702791923ec" containerName="neutron-httpd" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.864602 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="69896e76-e60c-4941-b013-a702791923ec" containerName="neutron-httpd" Mar 17 11:34:03 crc kubenswrapper[4742]: E0317 11:34:03.864619 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerName="ceilometer-notification-agent" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.864627 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerName="ceilometer-notification-agent" Mar 17 11:34:03 crc kubenswrapper[4742]: E0317 11:34:03.864648 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerName="ceilometer-central-agent" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.864656 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerName="ceilometer-central-agent" Mar 17 11:34:03 crc kubenswrapper[4742]: E0317 11:34:03.864670 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerName="sg-core" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.864678 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerName="sg-core" Mar 17 11:34:03 crc kubenswrapper[4742]: E0317 11:34:03.864700 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d937ab3-6dfb-4b0c-a846-da4820bad05f" containerName="glance-httpd" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.864707 4742 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="5d937ab3-6dfb-4b0c-a846-da4820bad05f" containerName="glance-httpd" Mar 17 11:34:03 crc kubenswrapper[4742]: E0317 11:34:03.864717 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b078827f-c462-4bbb-8d77-06a978218545" containerName="placement-api" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.864724 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="b078827f-c462-4bbb-8d77-06a978218545" containerName="placement-api" Mar 17 11:34:03 crc kubenswrapper[4742]: E0317 11:34:03.864739 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69896e76-e60c-4941-b013-a702791923ec" containerName="neutron-api" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.864747 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="69896e76-e60c-4941-b013-a702791923ec" containerName="neutron-api" Mar 17 11:34:03 crc kubenswrapper[4742]: E0317 11:34:03.864766 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d937ab3-6dfb-4b0c-a846-da4820bad05f" containerName="glance-log" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.864788 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d937ab3-6dfb-4b0c-a846-da4820bad05f" containerName="glance-log" Mar 17 11:34:03 crc kubenswrapper[4742]: E0317 11:34:03.864805 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerName="proxy-httpd" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.864814 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerName="proxy-httpd" Mar 17 11:34:03 crc kubenswrapper[4742]: E0317 11:34:03.864829 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b078827f-c462-4bbb-8d77-06a978218545" containerName="placement-log" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.864840 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="b078827f-c462-4bbb-8d77-06a978218545" containerName="placement-log" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.865081 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="69896e76-e60c-4941-b013-a702791923ec" containerName="neutron-api" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.865166 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d937ab3-6dfb-4b0c-a846-da4820bad05f" containerName="glance-log" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.865186 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerName="proxy-httpd" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.865197 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerName="ceilometer-central-agent" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.865210 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerName="ceilometer-notification-agent" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.865220 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d937ab3-6dfb-4b0c-a846-da4820bad05f" containerName="glance-httpd" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.865235 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2bbef92-cd02-42d8-b81d-ab7248e29328" containerName="horizon-log" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.865249 4742 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b078827f-c462-4bbb-8d77-06a978218545" containerName="placement-api" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.865261 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2bbef92-cd02-42d8-b81d-ab7248e29328" containerName="horizon" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.865271 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" containerName="sg-core" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.865279 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="b078827f-c462-4bbb-8d77-06a978218545" containerName="placement-log" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.865293 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="69896e76-e60c-4941-b013-a702791923ec" containerName="neutron-httpd" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.878713 4742 scope.go:117] "RemoveContainer" containerID="c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.880387 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.880505 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.882374 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.882552 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.883140 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-scripts\") pod \"b078827f-c462-4bbb-8d77-06a978218545\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.883208 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-combined-ca-bundle\") pod \"b078827f-c462-4bbb-8d77-06a978218545\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.883256 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-config-data\") pod \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.883308 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d937ab3-6dfb-4b0c-a846-da4820bad05f-logs\") pod \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.883418 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-internal-tls-certs\") pod \"b078827f-c462-4bbb-8d77-06a978218545\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.883485 4742 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d937ab3-6dfb-4b0c-a846-da4820bad05f-httpd-run\") pod \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.883540 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-scripts\") pod \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.883565 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b078827f-c462-4bbb-8d77-06a978218545-logs\") pod \"b078827f-c462-4bbb-8d77-06a978218545\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.883604 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-public-tls-certs\") pod \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.883627 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pck8\" (UniqueName: \"kubernetes.io/projected/5d937ab3-6dfb-4b0c-a846-da4820bad05f-kube-api-access-2pck8\") pod \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.883654 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsrf6\" (UniqueName: \"kubernetes.io/projected/b078827f-c462-4bbb-8d77-06a978218545-kube-api-access-jsrf6\") pod \"b078827f-c462-4bbb-8d77-06a978218545\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.883694 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.883768 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-combined-ca-bundle\") pod \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\" (UID: \"5d937ab3-6dfb-4b0c-a846-da4820bad05f\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.883796 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-public-tls-certs\") pod \"b078827f-c462-4bbb-8d77-06a978218545\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.883826 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-config-data\") pod \"b078827f-c462-4bbb-8d77-06a978218545\" (UID: \"b078827f-c462-4bbb-8d77-06a978218545\") " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.884137 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5d937ab3-6dfb-4b0c-a846-da4820bad05f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5d937ab3-6dfb-4b0c-a846-da4820bad05f" (UID: "5d937ab3-6dfb-4b0c-a846-da4820bad05f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.885055 4742 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5d937ab3-6dfb-4b0c-a846-da4820bad05f-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.889222 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d937ab3-6dfb-4b0c-a846-da4820bad05f-logs" (OuterVolumeSpecName: "logs") pod "5d937ab3-6dfb-4b0c-a846-da4820bad05f" (UID: "5d937ab3-6dfb-4b0c-a846-da4820bad05f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.889749 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b078827f-c462-4bbb-8d77-06a978218545-logs" (OuterVolumeSpecName: "logs") pod "b078827f-c462-4bbb-8d77-06a978218545" (UID: "b078827f-c462-4bbb-8d77-06a978218545"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.892553 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-scripts" (OuterVolumeSpecName: "scripts") pod "b078827f-c462-4bbb-8d77-06a978218545" (UID: "b078827f-c462-4bbb-8d77-06a978218545"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.894162 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "5d937ab3-6dfb-4b0c-a846-da4820bad05f" (UID: "5d937ab3-6dfb-4b0c-a846-da4820bad05f"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.898273 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b078827f-c462-4bbb-8d77-06a978218545-kube-api-access-jsrf6" (OuterVolumeSpecName: "kube-api-access-jsrf6") pod "b078827f-c462-4bbb-8d77-06a978218545" (UID: "b078827f-c462-4bbb-8d77-06a978218545"). InnerVolumeSpecName "kube-api-access-jsrf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.903360 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d937ab3-6dfb-4b0c-a846-da4820bad05f-kube-api-access-2pck8" (OuterVolumeSpecName: "kube-api-access-2pck8") pod "5d937ab3-6dfb-4b0c-a846-da4820bad05f" (UID: "5d937ab3-6dfb-4b0c-a846-da4820bad05f"). InnerVolumeSpecName "kube-api-access-2pck8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.927965 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-scripts" (OuterVolumeSpecName: "scripts") pod "5d937ab3-6dfb-4b0c-a846-da4820bad05f" (UID: "5d937ab3-6dfb-4b0c-a846-da4820bad05f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.930259 4742 scope.go:117] "RemoveContainer" containerID="aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18" Mar 17 11:34:03 crc kubenswrapper[4742]: E0317 11:34:03.937656 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18\": container with ID starting with aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18 not found: ID does not exist" containerID="aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.937708 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18"} err="failed to get container status \"aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18\": rpc error: code = NotFound desc = could not find container \"aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18\": container with ID starting with aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18 not found: ID does not exist" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.937738 4742 scope.go:117] "RemoveContainer" containerID="12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1" Mar 17 11:34:03 crc kubenswrapper[4742]: E0317 11:34:03.938221 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1\": container with ID starting with 12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1 not found: ID does not exist" containerID="12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.938266 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1"} err="failed to get container status \"12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1\": rpc error: code = NotFound desc = could not find container \"12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1\": container with ID starting with 12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1 not found: ID does not exist" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.938284 4742 scope.go:117] "RemoveContainer" containerID="dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26" Mar 17 11:34:03 crc kubenswrapper[4742]: E0317 11:34:03.938875 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26\": container with ID starting with dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26 not found: ID does not exist" containerID="dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.938929 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26"} err="failed to get container status \"dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26\": rpc error: code = NotFound desc = could not 
find container \"dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26\": container with ID starting with dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26 not found: ID does not exist" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.938956 4742 scope.go:117] "RemoveContainer" containerID="c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57" Mar 17 11:34:03 crc kubenswrapper[4742]: E0317 11:34:03.942264 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57\": container with ID starting with c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57 not found: ID does not exist" containerID="c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.942297 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57"} err="failed to get container status \"c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57\": rpc error: code = NotFound desc = could not find container \"c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57\": container with ID starting with c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57 not found: ID does not exist" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.942321 4742 scope.go:117] "RemoveContainer" containerID="aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.942684 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18"} err="failed to get container status \"aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18\": rpc error: code = NotFound desc = could not find container \"aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18\": container with ID starting with aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18 not found: ID does not exist" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.942702 4742 scope.go:117] "RemoveContainer" containerID="12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.943040 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1"} err="failed to get container status \"12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1\": rpc error: code = NotFound desc = could not find container \"12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1\": container with ID starting with 12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1 not found: ID does not exist" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.943059 4742 scope.go:117] "RemoveContainer" containerID="dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.943328 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26"} err="failed to get container status \"dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26\": rpc error: code = NotFound desc = could not 
find container \"dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26\": container with ID starting with dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26 not found: ID does not exist" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.943352 4742 scope.go:117] "RemoveContainer" containerID="c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.943623 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57"} err="failed to get container status \"c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57\": rpc error: code = NotFound desc = could not find container \"c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57\": container with ID starting with c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57 not found: ID does not exist" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.943649 4742 scope.go:117] "RemoveContainer" containerID="aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.943922 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18"} err="failed to get container status \"aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18\": rpc error: code = NotFound desc = could not find container \"aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18\": container with ID starting with aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18 not found: ID does not exist" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.943944 4742 scope.go:117] "RemoveContainer" containerID="12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.945311 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1"} err="failed to get container status \"12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1\": rpc error: code = NotFound desc = could not find container \"12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1\": container with ID starting with 12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1 not found: ID does not exist" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.945340 4742 scope.go:117] "RemoveContainer" containerID="dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.945669 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26"} err="failed to get container status \"dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26\": rpc error: code = NotFound desc = could not find container \"dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26\": container with ID starting with dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26 not found: ID does not exist" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.945706 4742 scope.go:117] "RemoveContainer" containerID="c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.946317 4742 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57"} err="failed to get container status \"c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57\": rpc error: code = NotFound desc = could not find container \"c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57\": container with ID starting with c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57 not found: ID does not exist" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.946342 4742 scope.go:117] "RemoveContainer" containerID="aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.946622 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18"} err="failed to get container status \"aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18\": rpc error: code = NotFound desc = could not find container \"aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18\": container with ID starting with aa7f47488f91952bdc75dc62eb5dad593877d7080d4510d8e00ecdd00805fa18 not found: ID does not exist" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.946662 4742 scope.go:117] "RemoveContainer" containerID="12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.946964 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1"} err="failed to get container status \"12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1\": rpc error: code = NotFound desc = could not find container \"12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1\": container with ID starting with 12db0eda6e978f96326f6de2262200483002d8723461b0259ecadaf36f4d7ef1 not found: ID does not exist" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.947009 4742 scope.go:117] "RemoveContainer" containerID="dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.947622 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26"} err="failed to get container status \"dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26\": rpc error: code = NotFound desc = could not find container \"dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26\": container with ID starting with dbb1b852b70902c1f86a6908514e49222512f7f024510aed014198d142569c26 not found: ID does not exist" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.947653 4742 scope.go:117] "RemoveContainer" containerID="c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.948183 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57"} err="failed to get container status \"c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57\": rpc error: code = NotFound desc = could not find container \"c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57\": container with ID starting with 
c8ccb4604da9b32ba3b0c51390f1a1b3e22eed1a5776dddb9339bf8e58efaf57 not found: ID does not exist" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.962977 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d937ab3-6dfb-4b0c-a846-da4820bad05f" (UID: "5d937ab3-6dfb-4b0c-a846-da4820bad05f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.964091 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b078827f-c462-4bbb-8d77-06a978218545" (UID: "b078827f-c462-4bbb-8d77-06a978218545"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.986845 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-config-data\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.987056 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cmk5\" (UniqueName: \"kubernetes.io/projected/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-kube-api-access-2cmk5\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.987458 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.987631 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-scripts\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.987763 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.987836 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-run-httpd\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.987854 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-log-httpd\") pod \"ceilometer-0\" (UID: 
\"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.988015 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.988031 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d937ab3-6dfb-4b0c-a846-da4820bad05f-logs\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.988043 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.988053 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b078827f-c462-4bbb-8d77-06a978218545-logs\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.988065 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pck8\" (UniqueName: \"kubernetes.io/projected/5d937ab3-6dfb-4b0c-a846-da4820bad05f-kube-api-access-2pck8\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.988077 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsrf6\" (UniqueName: \"kubernetes.io/projected/b078827f-c462-4bbb-8d77-06a978218545-kube-api-access-jsrf6\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.988104 4742 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.988119 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.988131 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:03 crc kubenswrapper[4742]: I0317 11:34:03.999421 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-config-data" (OuterVolumeSpecName: "config-data") pod "5d937ab3-6dfb-4b0c-a846-da4820bad05f" (UID: "5d937ab3-6dfb-4b0c-a846-da4820bad05f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.005156 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-config-data" (OuterVolumeSpecName: "config-data") pod "b078827f-c462-4bbb-8d77-06a978218545" (UID: "b078827f-c462-4bbb-8d77-06a978218545"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.017631 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5d937ab3-6dfb-4b0c-a846-da4820bad05f" (UID: "5d937ab3-6dfb-4b0c-a846-da4820bad05f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.018129 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b078827f-c462-4bbb-8d77-06a978218545" (UID: "b078827f-c462-4bbb-8d77-06a978218545"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.020882 4742 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.070881 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b078827f-c462-4bbb-8d77-06a978218545" (UID: "b078827f-c462-4bbb-8d77-06a978218545"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.090960 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-config-data\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.091307 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cmk5\" (UniqueName: \"kubernetes.io/projected/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-kube-api-access-2cmk5\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.091427 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.091571 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-scripts\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.091726 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.091885 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-run-httpd\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.092009 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-log-httpd\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.092186 4742 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.092280 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.092411 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.092948 4742 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b078827f-c462-4bbb-8d77-06a978218545-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.093112 4742 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d937ab3-6dfb-4b0c-a846-da4820bad05f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.093204 4742 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.093021 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-log-httpd\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.094336 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-run-httpd\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.108053 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.108618 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-config-data\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.109701 4742 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-scripts\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.110430 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cmk5\" (UniqueName: \"kubernetes.io/projected/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-kube-api-access-2cmk5\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.111475 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " pod="openstack/ceilometer-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.217175 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.435918 4742 generic.go:334] "Generic (PLEG): container finished" podID="b078827f-c462-4bbb-8d77-06a978218545" containerID="3325ce256720c2da33849b80aa0da173fb655eaaba82d00416cacdf1b1b8e0f6" exitCode=0 Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.435992 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5485d7d4fb-62qtm" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.436017 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5485d7d4fb-62qtm" event={"ID":"b078827f-c462-4bbb-8d77-06a978218545","Type":"ContainerDied","Data":"3325ce256720c2da33849b80aa0da173fb655eaaba82d00416cacdf1b1b8e0f6"} Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.436356 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5485d7d4fb-62qtm" event={"ID":"b078827f-c462-4bbb-8d77-06a978218545","Type":"ContainerDied","Data":"8cbf74bf60be739cab962c034fc899f41be1478339cef37bfbddfb78ef382cb2"} Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.436382 4742 scope.go:117] "RemoveContainer" containerID="3325ce256720c2da33849b80aa0da173fb655eaaba82d00416cacdf1b1b8e0f6" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.461085 4742 generic.go:334] "Generic (PLEG): container finished" podID="bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df" containerID="9a27c20a440ff32b968b604ff8353be71ac1dc698779ef00ba40c41a91b652a5" exitCode=0 Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.461408 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562454-s7z5z" event={"ID":"bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df","Type":"ContainerDied","Data":"9a27c20a440ff32b968b604ff8353be71ac1dc698779ef00ba40c41a91b652a5"} Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.470842 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5d937ab3-6dfb-4b0c-a846-da4820bad05f","Type":"ContainerDied","Data":"1559594b60e090c6fda96076e1e12cbefd8b566ffab8c5b3b508707d9ffd7d64"} Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.470964 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.539138 4742 scope.go:117] "RemoveContainer" containerID="ce8b09bc00016b52c043dd68c02387a5a892be022880d1ac0dc4c2353c9a165b" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.561044 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.576432 4742 scope.go:117] "RemoveContainer" containerID="3325ce256720c2da33849b80aa0da173fb655eaaba82d00416cacdf1b1b8e0f6" Mar 17 11:34:04 crc kubenswrapper[4742]: E0317 11:34:04.578230 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3325ce256720c2da33849b80aa0da173fb655eaaba82d00416cacdf1b1b8e0f6\": container with ID starting with 3325ce256720c2da33849b80aa0da173fb655eaaba82d00416cacdf1b1b8e0f6 not found: ID does not exist" containerID="3325ce256720c2da33849b80aa0da173fb655eaaba82d00416cacdf1b1b8e0f6" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.578267 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3325ce256720c2da33849b80aa0da173fb655eaaba82d00416cacdf1b1b8e0f6"} err="failed to get container status \"3325ce256720c2da33849b80aa0da173fb655eaaba82d00416cacdf1b1b8e0f6\": rpc error: code = NotFound desc = could not find container \"3325ce256720c2da33849b80aa0da173fb655eaaba82d00416cacdf1b1b8e0f6\": container with ID starting with 3325ce256720c2da33849b80aa0da173fb655eaaba82d00416cacdf1b1b8e0f6 not found: ID does not exist" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.578288 4742 scope.go:117] "RemoveContainer" containerID="ce8b09bc00016b52c043dd68c02387a5a892be022880d1ac0dc4c2353c9a165b" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.579068 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 11:34:04 crc kubenswrapper[4742]: E0317 11:34:04.579458 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce8b09bc00016b52c043dd68c02387a5a892be022880d1ac0dc4c2353c9a165b\": container with ID starting with ce8b09bc00016b52c043dd68c02387a5a892be022880d1ac0dc4c2353c9a165b not found: ID does not exist" containerID="ce8b09bc00016b52c043dd68c02387a5a892be022880d1ac0dc4c2353c9a165b" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.579489 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce8b09bc00016b52c043dd68c02387a5a892be022880d1ac0dc4c2353c9a165b"} err="failed to get container status \"ce8b09bc00016b52c043dd68c02387a5a892be022880d1ac0dc4c2353c9a165b\": rpc error: code = NotFound desc = could not find container \"ce8b09bc00016b52c043dd68c02387a5a892be022880d1ac0dc4c2353c9a165b\": container with ID starting with ce8b09bc00016b52c043dd68c02387a5a892be022880d1ac0dc4c2353c9a165b not found: ID does not exist" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.579514 4742 scope.go:117] "RemoveContainer" containerID="471b1b0b9bc05fd61003bdefb4957ca8755e2dd3462d5fa147269aa7e6143ecf" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.602949 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5485d7d4fb-62qtm"] Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.605067 4742 scope.go:117] "RemoveContainer" 
containerID="b16a1e0dea5c35e5a713c52318aa554fbd5a170e6b1beda799e52b7f33cd1c7b" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.618529 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5485d7d4fb-62qtm"] Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.633568 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.635130 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.641760 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.642061 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.642646 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.673450 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25974d2f-0187-46fd-8d7a-4ac3c5b555d0" path="/var/lib/kubelet/pods/25974d2f-0187-46fd-8d7a-4ac3c5b555d0/volumes" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.674337 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d937ab3-6dfb-4b0c-a846-da4820bad05f" path="/var/lib/kubelet/pods/5d937ab3-6dfb-4b0c-a846-da4820bad05f/volumes" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.678061 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b078827f-c462-4bbb-8d77-06a978218545" path="/var/lib/kubelet/pods/b078827f-c462-4bbb-8d77-06a978218545/volumes" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.707083 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdc48ac3-7501-4e63-9290-bff06909b045-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.707137 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.707155 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdc48ac3-7501-4e63-9290-bff06909b045-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.707174 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdc48ac3-7501-4e63-9290-bff06909b045-config-data\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.707215 4742 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw9qg\" (UniqueName: \"kubernetes.io/projected/fdc48ac3-7501-4e63-9290-bff06909b045-kube-api-access-tw9qg\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.707252 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdc48ac3-7501-4e63-9290-bff06909b045-logs\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.707303 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdc48ac3-7501-4e63-9290-bff06909b045-scripts\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.707328 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdc48ac3-7501-4e63-9290-bff06909b045-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.713446 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.809382 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw9qg\" (UniqueName: \"kubernetes.io/projected/fdc48ac3-7501-4e63-9290-bff06909b045-kube-api-access-tw9qg\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.809724 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdc48ac3-7501-4e63-9290-bff06909b045-logs\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.809756 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdc48ac3-7501-4e63-9290-bff06909b045-scripts\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.809783 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdc48ac3-7501-4e63-9290-bff06909b045-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.809867 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdc48ac3-7501-4e63-9290-bff06909b045-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " 
pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.809896 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.809936 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdc48ac3-7501-4e63-9290-bff06909b045-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.809960 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdc48ac3-7501-4e63-9290-bff06909b045-config-data\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.810997 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.813395 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fdc48ac3-7501-4e63-9290-bff06909b045-logs\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.813639 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fdc48ac3-7501-4e63-9290-bff06909b045-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.816090 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fdc48ac3-7501-4e63-9290-bff06909b045-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.820883 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdc48ac3-7501-4e63-9290-bff06909b045-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.821184 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdc48ac3-7501-4e63-9290-bff06909b045-scripts\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.830387 4742 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdc48ac3-7501-4e63-9290-bff06909b045-config-data\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.836699 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw9qg\" (UniqueName: \"kubernetes.io/projected/fdc48ac3-7501-4e63-9290-bff06909b045-kube-api-access-tw9qg\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.861264 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"fdc48ac3-7501-4e63-9290-bff06909b045\") " pod="openstack/glance-default-external-api-0" Mar 17 11:34:04 crc kubenswrapper[4742]: I0317 11:34:04.956972 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 17 11:34:05 crc kubenswrapper[4742]: I0317 11:34:05.479647 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7","Type":"ContainerStarted","Data":"b445c489d60d0b5764154a19772023ad4ce9eac67f8f5fb7851f7a2f499cd05c"} Mar 17 11:34:05 crc kubenswrapper[4742]: I0317 11:34:05.481140 4742 generic.go:334] "Generic (PLEG): container finished" podID="24b880a5-c4dc-4566-80c3-13fddf078932" containerID="8f9fd3cf05b1b274b0195ccc59bfc69550a213322d5dfbdf127ca7bc53d87c06" exitCode=0 Mar 17 11:34:05 crc kubenswrapper[4742]: I0317 11:34:05.481177 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24b880a5-c4dc-4566-80c3-13fddf078932","Type":"ContainerDied","Data":"8f9fd3cf05b1b274b0195ccc59bfc69550a213322d5dfbdf127ca7bc53d87c06"} Mar 17 11:34:05 crc kubenswrapper[4742]: I0317 11:34:05.584674 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 17 11:34:05 crc kubenswrapper[4742]: I0317 11:34:05.826359 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562454-s7z5z" Mar 17 11:34:05 crc kubenswrapper[4742]: I0317 11:34:05.941869 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drk7f\" (UniqueName: \"kubernetes.io/projected/bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df-kube-api-access-drk7f\") pod \"bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df\" (UID: \"bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df\") " Mar 17 11:34:05 crc kubenswrapper[4742]: I0317 11:34:05.947257 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df-kube-api-access-drk7f" (OuterVolumeSpecName: "kube-api-access-drk7f") pod "bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df" (UID: "bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df"). InnerVolumeSpecName "kube-api-access-drk7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:34:06 crc kubenswrapper[4742]: I0317 11:34:06.044215 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drk7f\" (UniqueName: \"kubernetes.io/projected/bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df-kube-api-access-drk7f\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:06 crc kubenswrapper[4742]: I0317 11:34:06.049923 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:34:06 crc kubenswrapper[4742]: I0317 11:34:06.473652 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562448-4pbmf"] Mar 17 11:34:06 crc kubenswrapper[4742]: I0317 11:34:06.485161 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562448-4pbmf"] Mar 17 11:34:06 crc kubenswrapper[4742]: I0317 11:34:06.493603 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562454-s7z5z" event={"ID":"bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df","Type":"ContainerDied","Data":"127ef21f250feed9ca6b7f3a401e2e002e77aaa98c3e7c6d311fc59c4b0c0789"} Mar 17 11:34:06 crc kubenswrapper[4742]: I0317 11:34:06.493646 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="127ef21f250feed9ca6b7f3a401e2e002e77aaa98c3e7c6d311fc59c4b0c0789" Mar 17 11:34:06 crc kubenswrapper[4742]: I0317 11:34:06.493707 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562454-s7z5z" Mar 17 11:34:06 crc kubenswrapper[4742]: I0317 11:34:06.495362 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fdc48ac3-7501-4e63-9290-bff06909b045","Type":"ContainerStarted","Data":"4b105266e4b7755d4f3d52ea81801bc67f0fe85e8bbe59a38df91958a41fd784"} Mar 17 11:34:06 crc kubenswrapper[4742]: I0317 11:34:06.673023 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a9d6abc-5eec-4122-9f35-665d934ff0ff" path="/var/lib/kubelet/pods/6a9d6abc-5eec-4122-9f35-665d934ff0ff/volumes" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.535421 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7","Type":"ContainerStarted","Data":"45d449aa9ee7b167df8df43706dee67784b89392cff27de48e66800a81202c1c"} Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.678394 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.775112 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-combined-ca-bundle\") pod \"24b880a5-c4dc-4566-80c3-13fddf078932\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.775275 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcqcp\" (UniqueName: \"kubernetes.io/projected/24b880a5-c4dc-4566-80c3-13fddf078932-kube-api-access-rcqcp\") pod \"24b880a5-c4dc-4566-80c3-13fddf078932\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.775302 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24b880a5-c4dc-4566-80c3-13fddf078932-httpd-run\") pod \"24b880a5-c4dc-4566-80c3-13fddf078932\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.775350 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-scripts\") pod \"24b880a5-c4dc-4566-80c3-13fddf078932\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.775376 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24b880a5-c4dc-4566-80c3-13fddf078932-logs\") pod \"24b880a5-c4dc-4566-80c3-13fddf078932\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.775425 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-config-data\") pod \"24b880a5-c4dc-4566-80c3-13fddf078932\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.775503 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-internal-tls-certs\") pod \"24b880a5-c4dc-4566-80c3-13fddf078932\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.775552 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"24b880a5-c4dc-4566-80c3-13fddf078932\" (UID: \"24b880a5-c4dc-4566-80c3-13fddf078932\") " Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.782233 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24b880a5-c4dc-4566-80c3-13fddf078932-logs" (OuterVolumeSpecName: "logs") pod "24b880a5-c4dc-4566-80c3-13fddf078932" (UID: "24b880a5-c4dc-4566-80c3-13fddf078932"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.785797 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24b880a5-c4dc-4566-80c3-13fddf078932-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "24b880a5-c4dc-4566-80c3-13fddf078932" (UID: "24b880a5-c4dc-4566-80c3-13fddf078932"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.786575 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-scripts" (OuterVolumeSpecName: "scripts") pod "24b880a5-c4dc-4566-80c3-13fddf078932" (UID: "24b880a5-c4dc-4566-80c3-13fddf078932"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.787060 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "24b880a5-c4dc-4566-80c3-13fddf078932" (UID: "24b880a5-c4dc-4566-80c3-13fddf078932"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.801151 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24b880a5-c4dc-4566-80c3-13fddf078932-kube-api-access-rcqcp" (OuterVolumeSpecName: "kube-api-access-rcqcp") pod "24b880a5-c4dc-4566-80c3-13fddf078932" (UID: "24b880a5-c4dc-4566-80c3-13fddf078932"). InnerVolumeSpecName "kube-api-access-rcqcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.831156 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24b880a5-c4dc-4566-80c3-13fddf078932" (UID: "24b880a5-c4dc-4566-80c3-13fddf078932"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.851171 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "24b880a5-c4dc-4566-80c3-13fddf078932" (UID: "24b880a5-c4dc-4566-80c3-13fddf078932"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.867972 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-config-data" (OuterVolumeSpecName: "config-data") pod "24b880a5-c4dc-4566-80c3-13fddf078932" (UID: "24b880a5-c4dc-4566-80c3-13fddf078932"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.877657 4742 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.877729 4742 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.877742 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.877754 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcqcp\" (UniqueName: \"kubernetes.io/projected/24b880a5-c4dc-4566-80c3-13fddf078932-kube-api-access-rcqcp\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.877771 4742 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24b880a5-c4dc-4566-80c3-13fddf078932-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.877781 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.877792 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24b880a5-c4dc-4566-80c3-13fddf078932-logs\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.877803 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24b880a5-c4dc-4566-80c3-13fddf078932-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.911312 4742 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 17 11:34:07 crc kubenswrapper[4742]: I0317 11:34:07.979796 4742 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.549468 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.549463 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"24b880a5-c4dc-4566-80c3-13fddf078932","Type":"ContainerDied","Data":"e3e380aa34e43555ff3995abee65e5a83cc2a4dd6a3333c2f7d9c6f078af009d"} Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.551091 4742 scope.go:117] "RemoveContainer" containerID="8f9fd3cf05b1b274b0195ccc59bfc69550a213322d5dfbdf127ca7bc53d87c06" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.553738 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fdc48ac3-7501-4e63-9290-bff06909b045","Type":"ContainerStarted","Data":"6223081678bd0af04b5c4390c3c2613a5358dbd8e558e380f4bed1c30831def1"} Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.579294 4742 scope.go:117] "RemoveContainer" containerID="21dfba13d6ea06b5e5b7ca6e464fcedbd33bca247f133278945a35a27374bda5" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.586224 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.596229 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.606412 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 11:34:08 crc kubenswrapper[4742]: E0317 11:34:08.606844 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b880a5-c4dc-4566-80c3-13fddf078932" containerName="glance-httpd" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.606869 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b880a5-c4dc-4566-80c3-13fddf078932" containerName="glance-httpd" Mar 17 11:34:08 crc kubenswrapper[4742]: E0317 11:34:08.606879 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df" containerName="oc" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.606886 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df" containerName="oc" Mar 17 11:34:08 crc kubenswrapper[4742]: E0317 11:34:08.606928 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24b880a5-c4dc-4566-80c3-13fddf078932" containerName="glance-log" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.606938 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b880a5-c4dc-4566-80c3-13fddf078932" containerName="glance-log" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.607177 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b880a5-c4dc-4566-80c3-13fddf078932" containerName="glance-log" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.607203 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df" containerName="oc" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.607558 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b880a5-c4dc-4566-80c3-13fddf078932" containerName="glance-httpd" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.608791 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.611806 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.613147 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.618106 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.691926 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c030ab26-9079-49cf-837f-c0625cfe6cc3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.691977 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c030ab26-9079-49cf-837f-c0625cfe6cc3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.692017 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c030ab26-9079-49cf-837f-c0625cfe6cc3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.692054 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c030ab26-9079-49cf-837f-c0625cfe6cc3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.692133 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vrgv\" (UniqueName: \"kubernetes.io/projected/c030ab26-9079-49cf-837f-c0625cfe6cc3-kube-api-access-4vrgv\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.692354 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.692397 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c030ab26-9079-49cf-837f-c0625cfe6cc3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.692526 4742 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c030ab26-9079-49cf-837f-c0625cfe6cc3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.699877 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24b880a5-c4dc-4566-80c3-13fddf078932" path="/var/lib/kubelet/pods/24b880a5-c4dc-4566-80c3-13fddf078932/volumes" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.793858 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c030ab26-9079-49cf-837f-c0625cfe6cc3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.793935 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c030ab26-9079-49cf-837f-c0625cfe6cc3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.793971 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c030ab26-9079-49cf-837f-c0625cfe6cc3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.794037 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vrgv\" (UniqueName: \"kubernetes.io/projected/c030ab26-9079-49cf-837f-c0625cfe6cc3-kube-api-access-4vrgv\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.794084 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c030ab26-9079-49cf-837f-c0625cfe6cc3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.794101 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.794158 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c030ab26-9079-49cf-837f-c0625cfe6cc3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.794211 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c030ab26-9079-49cf-837f-c0625cfe6cc3-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.794369 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.794721 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c030ab26-9079-49cf-837f-c0625cfe6cc3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.797294 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c030ab26-9079-49cf-837f-c0625cfe6cc3-logs\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.799120 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c030ab26-9079-49cf-837f-c0625cfe6cc3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.799309 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c030ab26-9079-49cf-837f-c0625cfe6cc3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.800180 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c030ab26-9079-49cf-837f-c0625cfe6cc3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.802893 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c030ab26-9079-49cf-837f-c0625cfe6cc3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.816676 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vrgv\" (UniqueName: \"kubernetes.io/projected/c030ab26-9079-49cf-837f-c0625cfe6cc3-kube-api-access-4vrgv\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 11:34:08.830244 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c030ab26-9079-49cf-837f-c0625cfe6cc3\") " pod="openstack/glance-default-internal-api-0" Mar 17 11:34:08 crc kubenswrapper[4742]: I0317 
11:34:08.943132 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 17 11:34:09 crc kubenswrapper[4742]: I0317 11:34:09.497099 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 17 11:34:09 crc kubenswrapper[4742]: I0317 11:34:09.566349 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fdc48ac3-7501-4e63-9290-bff06909b045","Type":"ContainerStarted","Data":"0e38a727cb9fba5ae9e90ddc5b7f5a745bf2970d89032c217f2f0f895684eb8e"} Mar 17 11:34:09 crc kubenswrapper[4742]: I0317 11:34:09.569568 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7","Type":"ContainerStarted","Data":"2bdadf9f657a7837685e3d22e831c9dc1a307bdd950859bd29d4182df326be61"} Mar 17 11:34:09 crc kubenswrapper[4742]: I0317 11:34:09.569609 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7","Type":"ContainerStarted","Data":"9e2ef94e5f0904e8e9282e4098ab63fc250743bd353d378a43858f10316cf498"} Mar 17 11:34:09 crc kubenswrapper[4742]: I0317 11:34:09.570550 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c030ab26-9079-49cf-837f-c0625cfe6cc3","Type":"ContainerStarted","Data":"af14d56e5c49a7f95d0856bd5c56f9058023579990aa510e2764e9ca0d1b1586"} Mar 17 11:34:09 crc kubenswrapper[4742]: I0317 11:34:09.604282 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.604262299 podStartE2EDuration="5.604262299s" podCreationTimestamp="2026-03-17 11:34:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:34:09.60072049 +0000 UTC m=+1352.726848268" watchObservedRunningTime="2026-03-17 11:34:09.604262299 +0000 UTC m=+1352.730390057" Mar 17 11:34:10 crc kubenswrapper[4742]: I0317 11:34:10.582291 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c030ab26-9079-49cf-837f-c0625cfe6cc3","Type":"ContainerStarted","Data":"ca53994beac1ee4a3464bbea741956151d411a91e41096d0267329e648027589"} Mar 17 11:34:11 crc kubenswrapper[4742]: I0317 11:34:11.595455 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7","Type":"ContainerStarted","Data":"bb3448da10fb23198f6f57b9fbdad5ae2dfd1b6fa2d3c740742f7cd97f9748f1"} Mar 17 11:34:11 crc kubenswrapper[4742]: I0317 11:34:11.595873 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 11:34:11 crc kubenswrapper[4742]: I0317 11:34:11.595594 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerName="sg-core" containerID="cri-o://2bdadf9f657a7837685e3d22e831c9dc1a307bdd950859bd29d4182df326be61" gracePeriod=30 Mar 17 11:34:11 crc kubenswrapper[4742]: I0317 11:34:11.595532 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerName="ceilometer-central-agent" 
containerID="cri-o://45d449aa9ee7b167df8df43706dee67784b89392cff27de48e66800a81202c1c" gracePeriod=30 Mar 17 11:34:11 crc kubenswrapper[4742]: I0317 11:34:11.595637 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerName="ceilometer-notification-agent" containerID="cri-o://9e2ef94e5f0904e8e9282e4098ab63fc250743bd353d378a43858f10316cf498" gracePeriod=30 Mar 17 11:34:11 crc kubenswrapper[4742]: I0317 11:34:11.595663 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerName="proxy-httpd" containerID="cri-o://bb3448da10fb23198f6f57b9fbdad5ae2dfd1b6fa2d3c740742f7cd97f9748f1" gracePeriod=30 Mar 17 11:34:11 crc kubenswrapper[4742]: I0317 11:34:11.598465 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c030ab26-9079-49cf-837f-c0625cfe6cc3","Type":"ContainerStarted","Data":"e38e04b30114e93f16772afa07a55ed2b63e7ac8c83e2ba1a0119d918018fe2d"} Mar 17 11:34:11 crc kubenswrapper[4742]: I0317 11:34:11.628869 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.249050258 podStartE2EDuration="8.628855103s" podCreationTimestamp="2026-03-17 11:34:03 +0000 UTC" firstStartedPulling="2026-03-17 11:34:04.711227377 +0000 UTC m=+1347.837355135" lastFinishedPulling="2026-03-17 11:34:11.091032222 +0000 UTC m=+1354.217159980" observedRunningTime="2026-03-17 11:34:11.623643917 +0000 UTC m=+1354.749771685" watchObservedRunningTime="2026-03-17 11:34:11.628855103 +0000 UTC m=+1354.754982861" Mar 17 11:34:11 crc kubenswrapper[4742]: I0317 11:34:11.657924 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.6578929110000002 podStartE2EDuration="3.657892911s" podCreationTimestamp="2026-03-17 11:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:34:11.64385783 +0000 UTC m=+1354.769985598" watchObservedRunningTime="2026-03-17 11:34:11.657892911 +0000 UTC m=+1354.784020669" Mar 17 11:34:12 crc kubenswrapper[4742]: I0317 11:34:12.612433 4742 generic.go:334] "Generic (PLEG): container finished" podID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerID="bb3448da10fb23198f6f57b9fbdad5ae2dfd1b6fa2d3c740742f7cd97f9748f1" exitCode=0 Mar 17 11:34:12 crc kubenswrapper[4742]: I0317 11:34:12.612470 4742 generic.go:334] "Generic (PLEG): container finished" podID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerID="2bdadf9f657a7837685e3d22e831c9dc1a307bdd950859bd29d4182df326be61" exitCode=2 Mar 17 11:34:12 crc kubenswrapper[4742]: I0317 11:34:12.612479 4742 generic.go:334] "Generic (PLEG): container finished" podID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerID="9e2ef94e5f0904e8e9282e4098ab63fc250743bd353d378a43858f10316cf498" exitCode=0 Mar 17 11:34:12 crc kubenswrapper[4742]: I0317 11:34:12.612512 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7","Type":"ContainerDied","Data":"bb3448da10fb23198f6f57b9fbdad5ae2dfd1b6fa2d3c740742f7cd97f9748f1"} Mar 17 11:34:12 crc kubenswrapper[4742]: I0317 11:34:12.612555 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7","Type":"ContainerDied","Data":"2bdadf9f657a7837685e3d22e831c9dc1a307bdd950859bd29d4182df326be61"} Mar 17 11:34:12 crc kubenswrapper[4742]: I0317 11:34:12.612570 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7","Type":"ContainerDied","Data":"9e2ef94e5f0904e8e9282e4098ab63fc250743bd353d378a43858f10316cf498"} Mar 17 11:34:13 crc kubenswrapper[4742]: I0317 11:34:13.632193 4742 generic.go:334] "Generic (PLEG): container finished" podID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerID="45d449aa9ee7b167df8df43706dee67784b89392cff27de48e66800a81202c1c" exitCode=0 Mar 17 11:34:13 crc kubenswrapper[4742]: I0317 11:34:13.632260 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7","Type":"ContainerDied","Data":"45d449aa9ee7b167df8df43706dee67784b89392cff27de48e66800a81202c1c"} Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.009922 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.096384 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-log-httpd\") pod \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.096438 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-config-data\") pod \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.096543 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-run-httpd\") pod \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.096566 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-combined-ca-bundle\") pod \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.096595 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-sg-core-conf-yaml\") pod \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.096619 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cmk5\" (UniqueName: \"kubernetes.io/projected/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-kube-api-access-2cmk5\") pod \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.096686 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-scripts\") pod 
\"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\" (UID: \"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7\") " Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.097657 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" (UID: "d9f64e9a-793a-4980-b5c6-8ca6c6c495e7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.097853 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" (UID: "d9f64e9a-793a-4980-b5c6-8ca6c6c495e7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.102340 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-scripts" (OuterVolumeSpecName: "scripts") pod "d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" (UID: "d9f64e9a-793a-4980-b5c6-8ca6c6c495e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.104182 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-kube-api-access-2cmk5" (OuterVolumeSpecName: "kube-api-access-2cmk5") pod "d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" (UID: "d9f64e9a-793a-4980-b5c6-8ca6c6c495e7"). InnerVolumeSpecName "kube-api-access-2cmk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.124650 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" (UID: "d9f64e9a-793a-4980-b5c6-8ca6c6c495e7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.172663 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" (UID: "d9f64e9a-793a-4980-b5c6-8ca6c6c495e7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.200267 4742 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.200332 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.200352 4742 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.200396 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cmk5\" (UniqueName: \"kubernetes.io/projected/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-kube-api-access-2cmk5\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.200406 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.200418 4742 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.226433 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-config-data" (OuterVolumeSpecName: "config-data") pod "d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" (UID: "d9f64e9a-793a-4980-b5c6-8ca6c6c495e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.302215 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.645355 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f64e9a-793a-4980-b5c6-8ca6c6c495e7","Type":"ContainerDied","Data":"b445c489d60d0b5764154a19772023ad4ce9eac67f8f5fb7851f7a2f499cd05c"} Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.645394 4742 util.go:48] "No ready sandbox for pod can be found. 
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.645458 4742 scope.go:117] "RemoveContainer" containerID="bb3448da10fb23198f6f57b9fbdad5ae2dfd1b6fa2d3c740742f7cd97f9748f1"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.647354 4742 generic.go:334] "Generic (PLEG): container finished" podID="e15fe5ee-73d7-415a-a61c-a0e67d085f3a" containerID="49c102ea2f25979bc529898327d689137acdb1c3e0ef759e8b4da71e4736f9aa" exitCode=0
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.647398 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bq6vr" event={"ID":"e15fe5ee-73d7-415a-a61c-a0e67d085f3a","Type":"ContainerDied","Data":"49c102ea2f25979bc529898327d689137acdb1c3e0ef759e8b4da71e4736f9aa"}
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.699370 4742 scope.go:117] "RemoveContainer" containerID="2bdadf9f657a7837685e3d22e831c9dc1a307bdd950859bd29d4182df326be61"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.716595 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.731405 4742 scope.go:117] "RemoveContainer" containerID="9e2ef94e5f0904e8e9282e4098ab63fc250743bd353d378a43858f10316cf498"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.752504 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.762474 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 17 11:34:14 crc kubenswrapper[4742]: E0317 11:34:14.762884 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerName="proxy-httpd"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.762915 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerName="proxy-httpd"
Mar 17 11:34:14 crc kubenswrapper[4742]: E0317 11:34:14.762939 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerName="ceilometer-notification-agent"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.762955 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerName="ceilometer-notification-agent"
Mar 17 11:34:14 crc kubenswrapper[4742]: E0317 11:34:14.762966 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerName="sg-core"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.762972 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerName="sg-core"
Mar 17 11:34:14 crc kubenswrapper[4742]: E0317 11:34:14.762993 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerName="ceilometer-central-agent"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.762998 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerName="ceilometer-central-agent"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.763192 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerName="proxy-httpd"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.763207 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerName="ceilometer-notification-agent"
podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerName="ceilometer-notification-agent" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.763214 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerName="ceilometer-central-agent" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.763225 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" containerName="sg-core" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.764433 4742 scope.go:117] "RemoveContainer" containerID="45d449aa9ee7b167df8df43706dee67784b89392cff27de48e66800a81202c1c" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.764892 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.769845 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.770636 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.790074 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.819730 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwqxf\" (UniqueName: \"kubernetes.io/projected/7436ccc7-650f-4c72-8424-b68258770217-kube-api-access-cwqxf\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.819781 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-config-data\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.819806 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7436ccc7-650f-4c72-8424-b68258770217-log-httpd\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.819925 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.819950 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7436ccc7-650f-4c72-8424-b68258770217-run-httpd\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0" Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.819989 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-scripts\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0" Mar 
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.921975 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-config-data\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.922019 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7436ccc7-650f-4c72-8424-b68258770217-log-httpd\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.922071 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.922090 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7436ccc7-650f-4c72-8424-b68258770217-run-httpd\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.922123 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-scripts\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.922148 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.922262 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwqxf\" (UniqueName: \"kubernetes.io/projected/7436ccc7-650f-4c72-8424-b68258770217-kube-api-access-cwqxf\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.923730 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7436ccc7-650f-4c72-8424-b68258770217-run-httpd\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.924210 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7436ccc7-650f-4c72-8424-b68258770217-log-httpd\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.927741 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-scripts\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.928885 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.937602 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.939575 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwqxf\" (UniqueName: \"kubernetes.io/projected/7436ccc7-650f-4c72-8424-b68258770217-kube-api-access-cwqxf\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.945537 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-config-data\") pod \"ceilometer-0\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " pod="openstack/ceilometer-0"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.957391 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.957436 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.999535 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 17 11:34:14 crc kubenswrapper[4742]: I0317 11:34:14.999624 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 17 11:34:15 crc kubenswrapper[4742]: I0317 11:34:15.095184 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 17 11:34:15 crc kubenswrapper[4742]: I0317 11:34:15.536885 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 11:34:15 crc kubenswrapper[4742]: W0317 11:34:15.555169 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7436ccc7_650f_4c72_8424_b68258770217.slice/crio-0ad2f7c2eac4542e0ee2c6539f1a1ce319b311e694fcefdc33621b362c397835 WatchSource:0}: Error finding container 0ad2f7c2eac4542e0ee2c6539f1a1ce319b311e694fcefdc33621b362c397835: Status 404 returned error can't find the container with id 0ad2f7c2eac4542e0ee2c6539f1a1ce319b311e694fcefdc33621b362c397835
Mar 17 11:34:15 crc kubenswrapper[4742]: I0317 11:34:15.658401 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7436ccc7-650f-4c72-8424-b68258770217","Type":"ContainerStarted","Data":"0ad2f7c2eac4542e0ee2c6539f1a1ce319b311e694fcefdc33621b362c397835"}
Mar 17 11:34:15 crc kubenswrapper[4742]: I0317 11:34:15.661427 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 17 11:34:15 crc kubenswrapper[4742]: I0317 11:34:15.661455 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.091099 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-bq6vr"
Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.149530 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-config-data\") pod \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\" (UID: \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\") "
Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.149707 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-combined-ca-bundle\") pod \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\" (UID: \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\") "
Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.149858 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2f8w\" (UniqueName: \"kubernetes.io/projected/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-kube-api-access-g2f8w\") pod \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\" (UID: \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\") "
Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.149890 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-scripts\") pod \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\" (UID: \"e15fe5ee-73d7-415a-a61c-a0e67d085f3a\") "
Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.154744 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-scripts" (OuterVolumeSpecName: "scripts") pod "e15fe5ee-73d7-415a-a61c-a0e67d085f3a" (UID: "e15fe5ee-73d7-415a-a61c-a0e67d085f3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.156729 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-kube-api-access-g2f8w" (OuterVolumeSpecName: "kube-api-access-g2f8w") pod "e15fe5ee-73d7-415a-a61c-a0e67d085f3a" (UID: "e15fe5ee-73d7-415a-a61c-a0e67d085f3a"). InnerVolumeSpecName "kube-api-access-g2f8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.180809 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e15fe5ee-73d7-415a-a61c-a0e67d085f3a" (UID: "e15fe5ee-73d7-415a-a61c-a0e67d085f3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.188044 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-config-data" (OuterVolumeSpecName: "config-data") pod "e15fe5ee-73d7-415a-a61c-a0e67d085f3a" (UID: "e15fe5ee-73d7-415a-a61c-a0e67d085f3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.251852 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.251890 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2f8w\" (UniqueName: \"kubernetes.io/projected/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-kube-api-access-g2f8w\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.251928 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.251944 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e15fe5ee-73d7-415a-a61c-a0e67d085f3a-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.702323 4742 util.go:48] "No ready sandbox for pod can be found. 
Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.711933 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9f64e9a-793a-4980-b5c6-8ca6c6c495e7" path="/var/lib/kubelet/pods/d9f64e9a-793a-4980-b5c6-8ca6c6c495e7/volumes"
Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.714324 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-bq6vr" event={"ID":"e15fe5ee-73d7-415a-a61c-a0e67d085f3a","Type":"ContainerDied","Data":"3175413b26ce181e380f91de13370c43edda132ccc71b62920fb43e84d575e12"}
Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.714361 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3175413b26ce181e380f91de13370c43edda132ccc71b62920fb43e84d575e12"
Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.714376 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7436ccc7-650f-4c72-8424-b68258770217","Type":"ContainerStarted","Data":"6cdf3fbc57c84fd18359084ed0db93da1fac58dcd110f0fdeb978202cd2f9552"}
Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.976965 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 17 11:34:16 crc kubenswrapper[4742]: E0317 11:34:16.977768 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e15fe5ee-73d7-415a-a61c-a0e67d085f3a" containerName="nova-cell0-conductor-db-sync"
Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.977848 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="e15fe5ee-73d7-415a-a61c-a0e67d085f3a" containerName="nova-cell0-conductor-db-sync"
Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.978124 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="e15fe5ee-73d7-415a-a61c-a0e67d085f3a" containerName="nova-cell0-conductor-db-sync"
Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.978773 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.980766 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-4tjcv"
Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.981361 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 17 11:34:16 crc kubenswrapper[4742]: I0317 11:34:16.994544 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 17 11:34:17 crc kubenswrapper[4742]: I0317 11:34:17.075870 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1ec33d-f957-48d9-9284-682dc28a3f09-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"db1ec33d-f957-48d9-9284-682dc28a3f09\") " pod="openstack/nova-cell0-conductor-0"
Mar 17 11:34:17 crc kubenswrapper[4742]: I0317 11:34:17.076057 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1ec33d-f957-48d9-9284-682dc28a3f09-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"db1ec33d-f957-48d9-9284-682dc28a3f09\") " pod="openstack/nova-cell0-conductor-0"
Mar 17 11:34:17 crc kubenswrapper[4742]: I0317 11:34:17.076104 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb79g\" (UniqueName: \"kubernetes.io/projected/db1ec33d-f957-48d9-9284-682dc28a3f09-kube-api-access-jb79g\") pod \"nova-cell0-conductor-0\" (UID: \"db1ec33d-f957-48d9-9284-682dc28a3f09\") " pod="openstack/nova-cell0-conductor-0"
Mar 17 11:34:17 crc kubenswrapper[4742]: I0317 11:34:17.179501 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1ec33d-f957-48d9-9284-682dc28a3f09-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"db1ec33d-f957-48d9-9284-682dc28a3f09\") " pod="openstack/nova-cell0-conductor-0"
Mar 17 11:34:17 crc kubenswrapper[4742]: I0317 11:34:17.179564 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb79g\" (UniqueName: \"kubernetes.io/projected/db1ec33d-f957-48d9-9284-682dc28a3f09-kube-api-access-jb79g\") pod \"nova-cell0-conductor-0\" (UID: \"db1ec33d-f957-48d9-9284-682dc28a3f09\") " pod="openstack/nova-cell0-conductor-0"
Mar 17 11:34:17 crc kubenswrapper[4742]: I0317 11:34:17.179690 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1ec33d-f957-48d9-9284-682dc28a3f09-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"db1ec33d-f957-48d9-9284-682dc28a3f09\") " pod="openstack/nova-cell0-conductor-0"
Mar 17 11:34:17 crc kubenswrapper[4742]: I0317 11:34:17.185504 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1ec33d-f957-48d9-9284-682dc28a3f09-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"db1ec33d-f957-48d9-9284-682dc28a3f09\") " pod="openstack/nova-cell0-conductor-0"
Mar 17 11:34:17 crc kubenswrapper[4742]: I0317 11:34:17.185645 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1ec33d-f957-48d9-9284-682dc28a3f09-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"db1ec33d-f957-48d9-9284-682dc28a3f09\") " pod="openstack/nova-cell0-conductor-0"
(UID: \"db1ec33d-f957-48d9-9284-682dc28a3f09\") " pod="openstack/nova-cell0-conductor-0" Mar 17 11:34:17 crc kubenswrapper[4742]: I0317 11:34:17.200176 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb79g\" (UniqueName: \"kubernetes.io/projected/db1ec33d-f957-48d9-9284-682dc28a3f09-kube-api-access-jb79g\") pod \"nova-cell0-conductor-0\" (UID: \"db1ec33d-f957-48d9-9284-682dc28a3f09\") " pod="openstack/nova-cell0-conductor-0" Mar 17 11:34:17 crc kubenswrapper[4742]: I0317 11:34:17.295674 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 17 11:34:17 crc kubenswrapper[4742]: I0317 11:34:17.698348 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 17 11:34:17 crc kubenswrapper[4742]: I0317 11:34:17.723528 4742 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 11:34:17 crc kubenswrapper[4742]: I0317 11:34:17.724142 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7436ccc7-650f-4c72-8424-b68258770217","Type":"ContainerStarted","Data":"e68da5f22c628b7c48c32eef77a3314500ef7524f09c7537f81b336ea1fda4fd"} Mar 17 11:34:17 crc kubenswrapper[4742]: I0317 11:34:17.724206 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7436ccc7-650f-4c72-8424-b68258770217","Type":"ContainerStarted","Data":"098897db03562c7689f259753d4df593dfbd96b8685fec72881e0fbd60268b8d"} Mar 17 11:34:17 crc kubenswrapper[4742]: I0317 11:34:17.742409 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 17 11:34:17 crc kubenswrapper[4742]: W0317 11:34:17.745692 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb1ec33d_f957_48d9_9284_682dc28a3f09.slice/crio-c7b1b49ffe828d1b8d984d6d67901cda25a1e519c209fe48c536ac1abb211af3 WatchSource:0}: Error finding container c7b1b49ffe828d1b8d984d6d67901cda25a1e519c209fe48c536ac1abb211af3: Status 404 returned error can't find the container with id c7b1b49ffe828d1b8d984d6d67901cda25a1e519c209fe48c536ac1abb211af3 Mar 17 11:34:17 crc kubenswrapper[4742]: I0317 11:34:17.810521 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 17 11:34:18 crc kubenswrapper[4742]: I0317 11:34:18.044642 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:34:18 crc kubenswrapper[4742]: I0317 11:34:18.044703 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:34:18 crc kubenswrapper[4742]: I0317 11:34:18.735413 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"db1ec33d-f957-48d9-9284-682dc28a3f09","Type":"ContainerStarted","Data":"e9ea045617080df7a23906a3524c3d38b58b57ae4361803d875fd610a01afe54"} Mar 17 11:34:18 crc kubenswrapper[4742]: 
Mar 17 11:34:18 crc kubenswrapper[4742]: I0317 11:34:18.735839 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"db1ec33d-f957-48d9-9284-682dc28a3f09","Type":"ContainerStarted","Data":"c7b1b49ffe828d1b8d984d6d67901cda25a1e519c209fe48c536ac1abb211af3"}
Mar 17 11:34:18 crc kubenswrapper[4742]: I0317 11:34:18.788476 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.788458963 podStartE2EDuration="2.788458963s" podCreationTimestamp="2026-03-17 11:34:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:34:18.781958152 +0000 UTC m=+1361.908085920" watchObservedRunningTime="2026-03-17 11:34:18.788458963 +0000 UTC m=+1361.914586721"
Mar 17 11:34:18 crc kubenswrapper[4742]: I0317 11:34:18.944312 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 17 11:34:18 crc kubenswrapper[4742]: I0317 11:34:18.944378 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 17 11:34:18 crc kubenswrapper[4742]: I0317 11:34:18.988401 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 17 11:34:18 crc kubenswrapper[4742]: I0317 11:34:18.991133 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 17 11:34:19 crc kubenswrapper[4742]: I0317 11:34:19.301406 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 17 11:34:19 crc kubenswrapper[4742]: I0317 11:34:19.745106 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 17 11:34:19 crc kubenswrapper[4742]: I0317 11:34:19.745156 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 17 11:34:20 crc kubenswrapper[4742]: I0317 11:34:20.756009 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7436ccc7-650f-4c72-8424-b68258770217","Type":"ContainerStarted","Data":"ed69d7748b71bae81f1a303fa0b9b3022833052a6a30694b653e181bd12ed9e0"}
Mar 17 11:34:20 crc kubenswrapper[4742]: I0317 11:34:20.756783 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="db1ec33d-f957-48d9-9284-682dc28a3f09" containerName="nova-cell0-conductor-conductor" containerID="cri-o://e9ea045617080df7a23906a3524c3d38b58b57ae4361803d875fd610a01afe54" gracePeriod=30
Mar 17 11:34:20 crc kubenswrapper[4742]: I0317 11:34:20.790619 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.692341302 podStartE2EDuration="6.790603945s" podCreationTimestamp="2026-03-17 11:34:14 +0000 UTC" firstStartedPulling="2026-03-17 11:34:15.558605468 +0000 UTC m=+1358.684733236" lastFinishedPulling="2026-03-17 11:34:19.656868121 +0000 UTC m=+1362.782995879" observedRunningTime="2026-03-17 11:34:20.787810697 +0000 UTC m=+1363.913938455" watchObservedRunningTime="2026-03-17 11:34:20.790603945 +0000 UTC m=+1363.916731703"
Mar 17 11:34:21 crc kubenswrapper[4742]: I0317 11:34:21.507394 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 11:34:21 crc kubenswrapper[4742]: I0317 11:34:21.766689 4742 generic.go:334] "Generic (PLEG): container finished" podID="db1ec33d-f957-48d9-9284-682dc28a3f09" containerID="e9ea045617080df7a23906a3524c3d38b58b57ae4361803d875fd610a01afe54" exitCode=0
Mar 17 11:34:21 crc kubenswrapper[4742]: I0317 11:34:21.766791 4742 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 17 11:34:21 crc kubenswrapper[4742]: I0317 11:34:21.766805 4742 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 17 11:34:21 crc kubenswrapper[4742]: I0317 11:34:21.767550 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"db1ec33d-f957-48d9-9284-682dc28a3f09","Type":"ContainerDied","Data":"e9ea045617080df7a23906a3524c3d38b58b57ae4361803d875fd610a01afe54"}
Mar 17 11:34:21 crc kubenswrapper[4742]: I0317 11:34:21.767582 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"db1ec33d-f957-48d9-9284-682dc28a3f09","Type":"ContainerDied","Data":"c7b1b49ffe828d1b8d984d6d67901cda25a1e519c209fe48c536ac1abb211af3"}
Mar 17 11:34:21 crc kubenswrapper[4742]: I0317 11:34:21.767600 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7b1b49ffe828d1b8d984d6d67901cda25a1e519c209fe48c536ac1abb211af3"
Mar 17 11:34:21 crc kubenswrapper[4742]: I0317 11:34:21.768372 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 17 11:34:21 crc kubenswrapper[4742]: I0317 11:34:21.787361 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 17 11:34:21 crc kubenswrapper[4742]: I0317 11:34:21.788382 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 17 11:34:21 crc kubenswrapper[4742]: I0317 11:34:21.939112 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 17 11:34:21 crc kubenswrapper[4742]: I0317 11:34:21.966924 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1ec33d-f957-48d9-9284-682dc28a3f09-config-data\") pod \"db1ec33d-f957-48d9-9284-682dc28a3f09\" (UID: \"db1ec33d-f957-48d9-9284-682dc28a3f09\") "
Mar 17 11:34:21 crc kubenswrapper[4742]: I0317 11:34:21.966959 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb79g\" (UniqueName: \"kubernetes.io/projected/db1ec33d-f957-48d9-9284-682dc28a3f09-kube-api-access-jb79g\") pod \"db1ec33d-f957-48d9-9284-682dc28a3f09\" (UID: \"db1ec33d-f957-48d9-9284-682dc28a3f09\") "
Mar 17 11:34:21 crc kubenswrapper[4742]: I0317 11:34:21.967018 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1ec33d-f957-48d9-9284-682dc28a3f09-combined-ca-bundle\") pod \"db1ec33d-f957-48d9-9284-682dc28a3f09\" (UID: \"db1ec33d-f957-48d9-9284-682dc28a3f09\") "
Mar 17 11:34:21 crc kubenswrapper[4742]: I0317 11:34:21.973167 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db1ec33d-f957-48d9-9284-682dc28a3f09-kube-api-access-jb79g" (OuterVolumeSpecName: "kube-api-access-jb79g") pod "db1ec33d-f957-48d9-9284-682dc28a3f09" (UID: "db1ec33d-f957-48d9-9284-682dc28a3f09"). InnerVolumeSpecName "kube-api-access-jb79g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.014715 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1ec33d-f957-48d9-9284-682dc28a3f09-config-data" (OuterVolumeSpecName: "config-data") pod "db1ec33d-f957-48d9-9284-682dc28a3f09" (UID: "db1ec33d-f957-48d9-9284-682dc28a3f09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.033934 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1ec33d-f957-48d9-9284-682dc28a3f09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db1ec33d-f957-48d9-9284-682dc28a3f09" (UID: "db1ec33d-f957-48d9-9284-682dc28a3f09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.069485 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1ec33d-f957-48d9-9284-682dc28a3f09-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.069529 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb79g\" (UniqueName: \"kubernetes.io/projected/db1ec33d-f957-48d9-9284-682dc28a3f09-kube-api-access-jb79g\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.069548 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1ec33d-f957-48d9-9284-682dc28a3f09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.775810 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.776523 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7436ccc7-650f-4c72-8424-b68258770217" containerName="sg-core" containerID="cri-o://e68da5f22c628b7c48c32eef77a3314500ef7524f09c7537f81b336ea1fda4fd" gracePeriod=30 Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.776572 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7436ccc7-650f-4c72-8424-b68258770217" containerName="ceilometer-notification-agent" containerID="cri-o://098897db03562c7689f259753d4df593dfbd96b8685fec72881e0fbd60268b8d" gracePeriod=30 Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.776541 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7436ccc7-650f-4c72-8424-b68258770217" containerName="proxy-httpd" containerID="cri-o://ed69d7748b71bae81f1a303fa0b9b3022833052a6a30694b653e181bd12ed9e0" gracePeriod=30 Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.778386 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7436ccc7-650f-4c72-8424-b68258770217" containerName="ceilometer-central-agent" containerID="cri-o://6cdf3fbc57c84fd18359084ed0db93da1fac58dcd110f0fdeb978202cd2f9552" gracePeriod=30 Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.816138 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.829222 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.845131 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 17 11:34:22 crc kubenswrapper[4742]: E0317 11:34:22.845491 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1ec33d-f957-48d9-9284-682dc28a3f09" containerName="nova-cell0-conductor-conductor" Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.845506 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1ec33d-f957-48d9-9284-682dc28a3f09" containerName="nova-cell0-conductor-conductor" Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.845648 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="db1ec33d-f957-48d9-9284-682dc28a3f09" 
containerName="nova-cell0-conductor-conductor" Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.846256 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.848411 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-4tjcv" Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.848765 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 17 11:34:22 crc kubenswrapper[4742]: I0317 11:34:22.873892 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 17 11:34:23 crc kubenswrapper[4742]: I0317 11:34:23.036724 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a50429-d785-408b-b53f-fef4700692c6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"32a50429-d785-408b-b53f-fef4700692c6\") " pod="openstack/nova-cell0-conductor-0" Mar 17 11:34:23 crc kubenswrapper[4742]: I0317 11:34:23.036788 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpdrg\" (UniqueName: \"kubernetes.io/projected/32a50429-d785-408b-b53f-fef4700692c6-kube-api-access-wpdrg\") pod \"nova-cell0-conductor-0\" (UID: \"32a50429-d785-408b-b53f-fef4700692c6\") " pod="openstack/nova-cell0-conductor-0" Mar 17 11:34:23 crc kubenswrapper[4742]: I0317 11:34:23.036886 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32a50429-d785-408b-b53f-fef4700692c6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"32a50429-d785-408b-b53f-fef4700692c6\") " pod="openstack/nova-cell0-conductor-0" Mar 17 11:34:23 crc kubenswrapper[4742]: I0317 11:34:23.138997 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a50429-d785-408b-b53f-fef4700692c6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"32a50429-d785-408b-b53f-fef4700692c6\") " pod="openstack/nova-cell0-conductor-0" Mar 17 11:34:23 crc kubenswrapper[4742]: I0317 11:34:23.139045 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpdrg\" (UniqueName: \"kubernetes.io/projected/32a50429-d785-408b-b53f-fef4700692c6-kube-api-access-wpdrg\") pod \"nova-cell0-conductor-0\" (UID: \"32a50429-d785-408b-b53f-fef4700692c6\") " pod="openstack/nova-cell0-conductor-0" Mar 17 11:34:23 crc kubenswrapper[4742]: I0317 11:34:23.139091 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32a50429-d785-408b-b53f-fef4700692c6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"32a50429-d785-408b-b53f-fef4700692c6\") " pod="openstack/nova-cell0-conductor-0" Mar 17 11:34:23 crc kubenswrapper[4742]: I0317 11:34:23.148515 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a50429-d785-408b-b53f-fef4700692c6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"32a50429-d785-408b-b53f-fef4700692c6\") " pod="openstack/nova-cell0-conductor-0" Mar 17 11:34:23 crc kubenswrapper[4742]: I0317 11:34:23.148714 4742 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32a50429-d785-408b-b53f-fef4700692c6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"32a50429-d785-408b-b53f-fef4700692c6\") " pod="openstack/nova-cell0-conductor-0" Mar 17 11:34:23 crc kubenswrapper[4742]: I0317 11:34:23.167244 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpdrg\" (UniqueName: \"kubernetes.io/projected/32a50429-d785-408b-b53f-fef4700692c6-kube-api-access-wpdrg\") pod \"nova-cell0-conductor-0\" (UID: \"32a50429-d785-408b-b53f-fef4700692c6\") " pod="openstack/nova-cell0-conductor-0" Mar 17 11:34:23 crc kubenswrapper[4742]: I0317 11:34:23.462288 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 17 11:34:23 crc kubenswrapper[4742]: I0317 11:34:23.796931 4742 generic.go:334] "Generic (PLEG): container finished" podID="7436ccc7-650f-4c72-8424-b68258770217" containerID="ed69d7748b71bae81f1a303fa0b9b3022833052a6a30694b653e181bd12ed9e0" exitCode=0 Mar 17 11:34:23 crc kubenswrapper[4742]: I0317 11:34:23.797299 4742 generic.go:334] "Generic (PLEG): container finished" podID="7436ccc7-650f-4c72-8424-b68258770217" containerID="e68da5f22c628b7c48c32eef77a3314500ef7524f09c7537f81b336ea1fda4fd" exitCode=2 Mar 17 11:34:23 crc kubenswrapper[4742]: I0317 11:34:23.797311 4742 generic.go:334] "Generic (PLEG): container finished" podID="7436ccc7-650f-4c72-8424-b68258770217" containerID="098897db03562c7689f259753d4df593dfbd96b8685fec72881e0fbd60268b8d" exitCode=0 Mar 17 11:34:23 crc kubenswrapper[4742]: I0317 11:34:23.797019 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7436ccc7-650f-4c72-8424-b68258770217","Type":"ContainerDied","Data":"ed69d7748b71bae81f1a303fa0b9b3022833052a6a30694b653e181bd12ed9e0"} Mar 17 11:34:23 crc kubenswrapper[4742]: I0317 11:34:23.797441 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7436ccc7-650f-4c72-8424-b68258770217","Type":"ContainerDied","Data":"e68da5f22c628b7c48c32eef77a3314500ef7524f09c7537f81b336ea1fda4fd"} Mar 17 11:34:23 crc kubenswrapper[4742]: I0317 11:34:23.797495 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7436ccc7-650f-4c72-8424-b68258770217","Type":"ContainerDied","Data":"098897db03562c7689f259753d4df593dfbd96b8685fec72881e0fbd60268b8d"} Mar 17 11:34:23 crc kubenswrapper[4742]: I0317 11:34:23.947106 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.672648 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db1ec33d-f957-48d9-9284-682dc28a3f09" path="/var/lib/kubelet/pods/db1ec33d-f957-48d9-9284-682dc28a3f09/volumes" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.674014 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.766448 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7436ccc7-650f-4c72-8424-b68258770217-log-httpd\") pod \"7436ccc7-650f-4c72-8424-b68258770217\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.766537 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwqxf\" (UniqueName: \"kubernetes.io/projected/7436ccc7-650f-4c72-8424-b68258770217-kube-api-access-cwqxf\") pod \"7436ccc7-650f-4c72-8424-b68258770217\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.766570 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-combined-ca-bundle\") pod \"7436ccc7-650f-4c72-8424-b68258770217\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.766587 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7436ccc7-650f-4c72-8424-b68258770217-run-httpd\") pod \"7436ccc7-650f-4c72-8424-b68258770217\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.766725 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-sg-core-conf-yaml\") pod \"7436ccc7-650f-4c72-8424-b68258770217\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.766751 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-scripts\") pod \"7436ccc7-650f-4c72-8424-b68258770217\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.766803 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-config-data\") pod \"7436ccc7-650f-4c72-8424-b68258770217\" (UID: \"7436ccc7-650f-4c72-8424-b68258770217\") " Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.767098 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7436ccc7-650f-4c72-8424-b68258770217-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7436ccc7-650f-4c72-8424-b68258770217" (UID: "7436ccc7-650f-4c72-8424-b68258770217"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.767287 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7436ccc7-650f-4c72-8424-b68258770217-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7436ccc7-650f-4c72-8424-b68258770217" (UID: "7436ccc7-650f-4c72-8424-b68258770217"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.776120 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7436ccc7-650f-4c72-8424-b68258770217-kube-api-access-cwqxf" (OuterVolumeSpecName: "kube-api-access-cwqxf") pod "7436ccc7-650f-4c72-8424-b68258770217" (UID: "7436ccc7-650f-4c72-8424-b68258770217"). InnerVolumeSpecName "kube-api-access-cwqxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.777607 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-scripts" (OuterVolumeSpecName: "scripts") pod "7436ccc7-650f-4c72-8424-b68258770217" (UID: "7436ccc7-650f-4c72-8424-b68258770217"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.807068 4742 generic.go:334] "Generic (PLEG): container finished" podID="7436ccc7-650f-4c72-8424-b68258770217" containerID="6cdf3fbc57c84fd18359084ed0db93da1fac58dcd110f0fdeb978202cd2f9552" exitCode=0 Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.807122 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.807182 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7436ccc7-650f-4c72-8424-b68258770217","Type":"ContainerDied","Data":"6cdf3fbc57c84fd18359084ed0db93da1fac58dcd110f0fdeb978202cd2f9552"} Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.807225 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7436ccc7-650f-4c72-8424-b68258770217","Type":"ContainerDied","Data":"0ad2f7c2eac4542e0ee2c6539f1a1ce319b311e694fcefdc33621b362c397835"} Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.807255 4742 scope.go:117] "RemoveContainer" containerID="ed69d7748b71bae81f1a303fa0b9b3022833052a6a30694b653e181bd12ed9e0" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.810525 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"32a50429-d785-408b-b53f-fef4700692c6","Type":"ContainerStarted","Data":"af7197f891f3b36fbd236209a5cd31f9c7ea56cc26a11cbb6ff59a7042e13fcf"} Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.810573 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"32a50429-d785-408b-b53f-fef4700692c6","Type":"ContainerStarted","Data":"0c254fc644afdf07608c13182c420e5d8c5f89fc719dd586d5a4d6cb8dcd301c"} Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.811208 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.860444 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7436ccc7-650f-4c72-8424-b68258770217" (UID: "7436ccc7-650f-4c72-8424-b68258770217"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.867419 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.867389219 podStartE2EDuration="2.867389219s" podCreationTimestamp="2026-03-17 11:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:34:24.843214144 +0000 UTC m=+1367.969341902" watchObservedRunningTime="2026-03-17 11:34:24.867389219 +0000 UTC m=+1367.993516987" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.869329 4742 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7436ccc7-650f-4c72-8424-b68258770217-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.869415 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwqxf\" (UniqueName: \"kubernetes.io/projected/7436ccc7-650f-4c72-8424-b68258770217-kube-api-access-cwqxf\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.869472 4742 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7436ccc7-650f-4c72-8424-b68258770217-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.869551 4742 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.869620 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.901192 4742 scope.go:117] "RemoveContainer" containerID="e68da5f22c628b7c48c32eef77a3314500ef7524f09c7537f81b336ea1fda4fd" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.967371 4742 scope.go:117] "RemoveContainer" containerID="098897db03562c7689f259753d4df593dfbd96b8685fec72881e0fbd60268b8d" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.970038 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7436ccc7-650f-4c72-8424-b68258770217" (UID: "7436ccc7-650f-4c72-8424-b68258770217"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:24 crc kubenswrapper[4742]: I0317 11:34:24.971398 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.017645 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-config-data" (OuterVolumeSpecName: "config-data") pod "7436ccc7-650f-4c72-8424-b68258770217" (UID: "7436ccc7-650f-4c72-8424-b68258770217"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.017806 4742 scope.go:117] "RemoveContainer" containerID="6cdf3fbc57c84fd18359084ed0db93da1fac58dcd110f0fdeb978202cd2f9552" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.035095 4742 scope.go:117] "RemoveContainer" containerID="ed69d7748b71bae81f1a303fa0b9b3022833052a6a30694b653e181bd12ed9e0" Mar 17 11:34:25 crc kubenswrapper[4742]: E0317 11:34:25.035598 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed69d7748b71bae81f1a303fa0b9b3022833052a6a30694b653e181bd12ed9e0\": container with ID starting with ed69d7748b71bae81f1a303fa0b9b3022833052a6a30694b653e181bd12ed9e0 not found: ID does not exist" containerID="ed69d7748b71bae81f1a303fa0b9b3022833052a6a30694b653e181bd12ed9e0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.035691 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed69d7748b71bae81f1a303fa0b9b3022833052a6a30694b653e181bd12ed9e0"} err="failed to get container status \"ed69d7748b71bae81f1a303fa0b9b3022833052a6a30694b653e181bd12ed9e0\": rpc error: code = NotFound desc = could not find container \"ed69d7748b71bae81f1a303fa0b9b3022833052a6a30694b653e181bd12ed9e0\": container with ID starting with ed69d7748b71bae81f1a303fa0b9b3022833052a6a30694b653e181bd12ed9e0 not found: ID does not exist" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.035765 4742 scope.go:117] "RemoveContainer" containerID="e68da5f22c628b7c48c32eef77a3314500ef7524f09c7537f81b336ea1fda4fd" Mar 17 11:34:25 crc kubenswrapper[4742]: E0317 11:34:25.036065 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e68da5f22c628b7c48c32eef77a3314500ef7524f09c7537f81b336ea1fda4fd\": container with ID starting with e68da5f22c628b7c48c32eef77a3314500ef7524f09c7537f81b336ea1fda4fd not found: ID does not exist" containerID="e68da5f22c628b7c48c32eef77a3314500ef7524f09c7537f81b336ea1fda4fd" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.036142 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68da5f22c628b7c48c32eef77a3314500ef7524f09c7537f81b336ea1fda4fd"} err="failed to get container status \"e68da5f22c628b7c48c32eef77a3314500ef7524f09c7537f81b336ea1fda4fd\": rpc error: code = NotFound desc = could not find container \"e68da5f22c628b7c48c32eef77a3314500ef7524f09c7537f81b336ea1fda4fd\": container with ID starting with e68da5f22c628b7c48c32eef77a3314500ef7524f09c7537f81b336ea1fda4fd not found: ID does not exist" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.036208 4742 scope.go:117] "RemoveContainer" containerID="098897db03562c7689f259753d4df593dfbd96b8685fec72881e0fbd60268b8d" Mar 17 11:34:25 crc kubenswrapper[4742]: E0317 11:34:25.036491 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"098897db03562c7689f259753d4df593dfbd96b8685fec72881e0fbd60268b8d\": container with ID starting with 098897db03562c7689f259753d4df593dfbd96b8685fec72881e0fbd60268b8d not found: ID does not exist" containerID="098897db03562c7689f259753d4df593dfbd96b8685fec72881e0fbd60268b8d" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.036563 4742 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"098897db03562c7689f259753d4df593dfbd96b8685fec72881e0fbd60268b8d"} err="failed to get container status \"098897db03562c7689f259753d4df593dfbd96b8685fec72881e0fbd60268b8d\": rpc error: code = NotFound desc = could not find container \"098897db03562c7689f259753d4df593dfbd96b8685fec72881e0fbd60268b8d\": container with ID starting with 098897db03562c7689f259753d4df593dfbd96b8685fec72881e0fbd60268b8d not found: ID does not exist" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.036632 4742 scope.go:117] "RemoveContainer" containerID="6cdf3fbc57c84fd18359084ed0db93da1fac58dcd110f0fdeb978202cd2f9552" Mar 17 11:34:25 crc kubenswrapper[4742]: E0317 11:34:25.036959 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cdf3fbc57c84fd18359084ed0db93da1fac58dcd110f0fdeb978202cd2f9552\": container with ID starting with 6cdf3fbc57c84fd18359084ed0db93da1fac58dcd110f0fdeb978202cd2f9552 not found: ID does not exist" containerID="6cdf3fbc57c84fd18359084ed0db93da1fac58dcd110f0fdeb978202cd2f9552" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.037045 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cdf3fbc57c84fd18359084ed0db93da1fac58dcd110f0fdeb978202cd2f9552"} err="failed to get container status \"6cdf3fbc57c84fd18359084ed0db93da1fac58dcd110f0fdeb978202cd2f9552\": rpc error: code = NotFound desc = could not find container \"6cdf3fbc57c84fd18359084ed0db93da1fac58dcd110f0fdeb978202cd2f9552\": container with ID starting with 6cdf3fbc57c84fd18359084ed0db93da1fac58dcd110f0fdeb978202cd2f9552 not found: ID does not exist" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.073140 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7436ccc7-650f-4c72-8424-b68258770217-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.147711 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.162686 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.174267 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:34:25 crc kubenswrapper[4742]: E0317 11:34:25.174635 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7436ccc7-650f-4c72-8424-b68258770217" containerName="proxy-httpd" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.174653 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="7436ccc7-650f-4c72-8424-b68258770217" containerName="proxy-httpd" Mar 17 11:34:25 crc kubenswrapper[4742]: E0317 11:34:25.174666 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7436ccc7-650f-4c72-8424-b68258770217" containerName="ceilometer-central-agent" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.174672 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="7436ccc7-650f-4c72-8424-b68258770217" containerName="ceilometer-central-agent" Mar 17 11:34:25 crc kubenswrapper[4742]: E0317 11:34:25.174693 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7436ccc7-650f-4c72-8424-b68258770217" containerName="sg-core" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.174699 4742 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7436ccc7-650f-4c72-8424-b68258770217" containerName="sg-core" Mar 17 11:34:25 crc kubenswrapper[4742]: E0317 11:34:25.174711 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7436ccc7-650f-4c72-8424-b68258770217" containerName="ceilometer-notification-agent" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.174717 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="7436ccc7-650f-4c72-8424-b68258770217" containerName="ceilometer-notification-agent" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.174882 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="7436ccc7-650f-4c72-8424-b68258770217" containerName="ceilometer-central-agent" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.174895 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="7436ccc7-650f-4c72-8424-b68258770217" containerName="proxy-httpd" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.174930 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="7436ccc7-650f-4c72-8424-b68258770217" containerName="ceilometer-notification-agent" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.174955 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="7436ccc7-650f-4c72-8424-b68258770217" containerName="sg-core" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.176561 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.180772 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.186679 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.197415 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.276198 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0ea41b4-8c5a-42e8-b589-db1ac541b789-run-httpd\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.276266 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0ea41b4-8c5a-42e8-b589-db1ac541b789-log-httpd\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.276309 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v74ct\" (UniqueName: \"kubernetes.io/projected/f0ea41b4-8c5a-42e8-b589-db1ac541b789-kube-api-access-v74ct\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.276393 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.276506 4742 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-config-data\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.276591 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.277163 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-scripts\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.379248 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-scripts\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.379340 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0ea41b4-8c5a-42e8-b589-db1ac541b789-run-httpd\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.379399 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0ea41b4-8c5a-42e8-b589-db1ac541b789-log-httpd\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.379881 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0ea41b4-8c5a-42e8-b589-db1ac541b789-run-httpd\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.380017 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0ea41b4-8c5a-42e8-b589-db1ac541b789-log-httpd\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.379478 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v74ct\" (UniqueName: \"kubernetes.io/projected/f0ea41b4-8c5a-42e8-b589-db1ac541b789-kube-api-access-v74ct\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.380187 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.380734 4742 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-config-data\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.380782 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.384707 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.384992 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-scripts\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.386489 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-config-data\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.388075 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.403474 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v74ct\" (UniqueName: \"kubernetes.io/projected/f0ea41b4-8c5a-42e8-b589-db1ac541b789-kube-api-access-v74ct\") pod \"ceilometer-0\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.503347 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:34:25 crc kubenswrapper[4742]: I0317 11:34:25.986581 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:34:26 crc kubenswrapper[4742]: I0317 11:34:26.677849 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7436ccc7-650f-4c72-8424-b68258770217" path="/var/lib/kubelet/pods/7436ccc7-650f-4c72-8424-b68258770217/volumes" Mar 17 11:34:26 crc kubenswrapper[4742]: I0317 11:34:26.834485 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0ea41b4-8c5a-42e8-b589-db1ac541b789","Type":"ContainerStarted","Data":"2b7be7092615d2f9627c9bb0be3ad0a70df0cdc04fb9ecfaf80ff88584f3d28a"} Mar 17 11:34:26 crc kubenswrapper[4742]: I0317 11:34:26.834921 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0ea41b4-8c5a-42e8-b589-db1ac541b789","Type":"ContainerStarted","Data":"4bd7b6d149488424331d085373296789fc3431cf57e2dcf477078853a6608686"} Mar 17 11:34:27 crc kubenswrapper[4742]: I0317 11:34:27.859392 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0ea41b4-8c5a-42e8-b589-db1ac541b789","Type":"ContainerStarted","Data":"e08a58bd15adbd148ff4db09ba73a2e6dfd8e8873feb7e70224ff14bd5d80a1b"} Mar 17 11:34:28 crc kubenswrapper[4742]: I0317 11:34:28.869813 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0ea41b4-8c5a-42e8-b589-db1ac541b789","Type":"ContainerStarted","Data":"20519f8eb41ddffbdd35517d36ad95844f9dfc611ffae1a481ddfbdf1a7723fa"} Mar 17 11:34:30 crc kubenswrapper[4742]: I0317 11:34:30.893875 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0ea41b4-8c5a-42e8-b589-db1ac541b789","Type":"ContainerStarted","Data":"f194506c036e69d9442a09efbf9c930196c4920974d106053879cdab4935fee5"} Mar 17 11:34:30 crc kubenswrapper[4742]: I0317 11:34:30.895030 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 11:34:30 crc kubenswrapper[4742]: I0317 11:34:30.929573 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.979642098 podStartE2EDuration="5.929551235s" podCreationTimestamp="2026-03-17 11:34:25 +0000 UTC" firstStartedPulling="2026-03-17 11:34:25.99601411 +0000 UTC m=+1369.122141868" lastFinishedPulling="2026-03-17 11:34:29.945923237 +0000 UTC m=+1373.072051005" observedRunningTime="2026-03-17 11:34:30.923877767 +0000 UTC m=+1374.050005555" watchObservedRunningTime="2026-03-17 11:34:30.929551235 +0000 UTC m=+1374.055679003" Mar 17 11:34:33 crc kubenswrapper[4742]: I0317 11:34:33.490301 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 17 11:34:33 crc kubenswrapper[4742]: I0317 11:34:33.993635 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mdzj6"] Mar 17 11:34:33 crc kubenswrapper[4742]: I0317 11:34:33.995039 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mdzj6" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.000128 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.000131 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.006075 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mdzj6"] Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.143181 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.144466 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.146620 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.167433 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mdzj6\" (UID: \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\") " pod="openstack/nova-cell0-cell-mapping-mdzj6" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.167514 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-config-data\") pod \"nova-cell0-cell-mapping-mdzj6\" (UID: \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\") " pod="openstack/nova-cell0-cell-mapping-mdzj6" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.167532 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-scripts\") pod \"nova-cell0-cell-mapping-mdzj6\" (UID: \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\") " pod="openstack/nova-cell0-cell-mapping-mdzj6" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.167615 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv57p\" (UniqueName: \"kubernetes.io/projected/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-kube-api-access-sv57p\") pod \"nova-cell0-cell-mapping-mdzj6\" (UID: \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\") " pod="openstack/nova-cell0-cell-mapping-mdzj6" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.179035 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.247954 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.249218 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.263966 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.268965 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv57p\" (UniqueName: \"kubernetes.io/projected/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-kube-api-access-sv57p\") pod \"nova-cell0-cell-mapping-mdzj6\" (UID: \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\") " pod="openstack/nova-cell0-cell-mapping-mdzj6" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.269013 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea48ccf-3d8b-43ec-a543-44f0217629b5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ea48ccf-3d8b-43ec-a543-44f0217629b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.269050 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz64d\" (UniqueName: \"kubernetes.io/projected/7ea48ccf-3d8b-43ec-a543-44f0217629b5-kube-api-access-bz64d\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ea48ccf-3d8b-43ec-a543-44f0217629b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.269069 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mdzj6\" (UID: \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\") " pod="openstack/nova-cell0-cell-mapping-mdzj6" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.269087 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea48ccf-3d8b-43ec-a543-44f0217629b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ea48ccf-3d8b-43ec-a543-44f0217629b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.269131 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-config-data\") pod \"nova-cell0-cell-mapping-mdzj6\" (UID: \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\") " pod="openstack/nova-cell0-cell-mapping-mdzj6" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.269153 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-scripts\") pod \"nova-cell0-cell-mapping-mdzj6\" (UID: \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\") " pod="openstack/nova-cell0-cell-mapping-mdzj6" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.279429 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.284611 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mdzj6\" (UID: \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\") " pod="openstack/nova-cell0-cell-mapping-mdzj6" Mar 17 11:34:34 crc 
kubenswrapper[4742]: I0317 11:34:34.305584 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-config-data\") pod \"nova-cell0-cell-mapping-mdzj6\" (UID: \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\") " pod="openstack/nova-cell0-cell-mapping-mdzj6" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.306291 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.311434 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.317254 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-scripts\") pod \"nova-cell0-cell-mapping-mdzj6\" (UID: \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\") " pod="openstack/nova-cell0-cell-mapping-mdzj6" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.340333 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.374012 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea48ccf-3d8b-43ec-a543-44f0217629b5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ea48ccf-3d8b-43ec-a543-44f0217629b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.374102 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl2p2\" (UniqueName: \"kubernetes.io/projected/dbdf90ea-46a2-4da7-a034-110be67d31b4-kube-api-access-cl2p2\") pod \"nova-scheduler-0\" (UID: \"dbdf90ea-46a2-4da7-a034-110be67d31b4\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.374127 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz64d\" (UniqueName: \"kubernetes.io/projected/7ea48ccf-3d8b-43ec-a543-44f0217629b5-kube-api-access-bz64d\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ea48ccf-3d8b-43ec-a543-44f0217629b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.374152 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea48ccf-3d8b-43ec-a543-44f0217629b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ea48ccf-3d8b-43ec-a543-44f0217629b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.374192 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdf90ea-46a2-4da7-a034-110be67d31b4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dbdf90ea-46a2-4da7-a034-110be67d31b4\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.374236 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbdf90ea-46a2-4da7-a034-110be67d31b4-config-data\") pod \"nova-scheduler-0\" (UID: \"dbdf90ea-46a2-4da7-a034-110be67d31b4\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:34 crc 
kubenswrapper[4742]: I0317 11:34:34.387586 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea48ccf-3d8b-43ec-a543-44f0217629b5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ea48ccf-3d8b-43ec-a543-44f0217629b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.406180 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea48ccf-3d8b-43ec-a543-44f0217629b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ea48ccf-3d8b-43ec-a543-44f0217629b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.418454 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.431564 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv57p\" (UniqueName: \"kubernetes.io/projected/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-kube-api-access-sv57p\") pod \"nova-cell0-cell-mapping-mdzj6\" (UID: \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\") " pod="openstack/nova-cell0-cell-mapping-mdzj6" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.439516 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz64d\" (UniqueName: \"kubernetes.io/projected/7ea48ccf-3d8b-43ec-a543-44f0217629b5-kube-api-access-bz64d\") pod \"nova-cell1-novncproxy-0\" (UID: \"7ea48ccf-3d8b-43ec-a543-44f0217629b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.473496 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.481846 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdf90ea-46a2-4da7-a034-110be67d31b4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dbdf90ea-46a2-4da7-a034-110be67d31b4\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.481920 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbdf90ea-46a2-4da7-a034-110be67d31b4-config-data\") pod \"nova-scheduler-0\" (UID: \"dbdf90ea-46a2-4da7-a034-110be67d31b4\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.481947 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn7j4\" (UniqueName: \"kubernetes.io/projected/9e114c04-a3c2-4c59-921e-a4f2289024e0-kube-api-access-cn7j4\") pod \"nova-metadata-0\" (UID: \"9e114c04-a3c2-4c59-921e-a4f2289024e0\") " pod="openstack/nova-metadata-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.482011 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e114c04-a3c2-4c59-921e-a4f2289024e0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9e114c04-a3c2-4c59-921e-a4f2289024e0\") " pod="openstack/nova-metadata-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.482042 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9e114c04-a3c2-4c59-921e-a4f2289024e0-config-data\") pod \"nova-metadata-0\" (UID: \"9e114c04-a3c2-4c59-921e-a4f2289024e0\") " pod="openstack/nova-metadata-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.482084 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e114c04-a3c2-4c59-921e-a4f2289024e0-logs\") pod \"nova-metadata-0\" (UID: \"9e114c04-a3c2-4c59-921e-a4f2289024e0\") " pod="openstack/nova-metadata-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.482112 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl2p2\" (UniqueName: \"kubernetes.io/projected/dbdf90ea-46a2-4da7-a034-110be67d31b4-kube-api-access-cl2p2\") pod \"nova-scheduler-0\" (UID: \"dbdf90ea-46a2-4da7-a034-110be67d31b4\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.491518 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdf90ea-46a2-4da7-a034-110be67d31b4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dbdf90ea-46a2-4da7-a034-110be67d31b4\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.516729 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbdf90ea-46a2-4da7-a034-110be67d31b4-config-data\") pod \"nova-scheduler-0\" (UID: \"dbdf90ea-46a2-4da7-a034-110be67d31b4\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.550223 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl2p2\" (UniqueName: \"kubernetes.io/projected/dbdf90ea-46a2-4da7-a034-110be67d31b4-kube-api-access-cl2p2\") pod \"nova-scheduler-0\" (UID: \"dbdf90ea-46a2-4da7-a034-110be67d31b4\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.583558 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e114c04-a3c2-4c59-921e-a4f2289024e0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9e114c04-a3c2-4c59-921e-a4f2289024e0\") " pod="openstack/nova-metadata-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.583801 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e114c04-a3c2-4c59-921e-a4f2289024e0-config-data\") pod \"nova-metadata-0\" (UID: \"9e114c04-a3c2-4c59-921e-a4f2289024e0\") " pod="openstack/nova-metadata-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.583850 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e114c04-a3c2-4c59-921e-a4f2289024e0-logs\") pod \"nova-metadata-0\" (UID: \"9e114c04-a3c2-4c59-921e-a4f2289024e0\") " pod="openstack/nova-metadata-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.583930 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn7j4\" (UniqueName: \"kubernetes.io/projected/9e114c04-a3c2-4c59-921e-a4f2289024e0-kube-api-access-cn7j4\") pod \"nova-metadata-0\" (UID: \"9e114c04-a3c2-4c59-921e-a4f2289024e0\") " pod="openstack/nova-metadata-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.584880 4742 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e114c04-a3c2-4c59-921e-a4f2289024e0-logs\") pod \"nova-metadata-0\" (UID: \"9e114c04-a3c2-4c59-921e-a4f2289024e0\") " pod="openstack/nova-metadata-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.626112 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mdzj6" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.632206 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e114c04-a3c2-4c59-921e-a4f2289024e0-config-data\") pod \"nova-metadata-0\" (UID: \"9e114c04-a3c2-4c59-921e-a4f2289024e0\") " pod="openstack/nova-metadata-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.632285 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-rvllq"] Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.633891 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-rvllq" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.658835 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e114c04-a3c2-4c59-921e-a4f2289024e0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9e114c04-a3c2-4c59-921e-a4f2289024e0\") " pod="openstack/nova-metadata-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.666522 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn7j4\" (UniqueName: \"kubernetes.io/projected/9e114c04-a3c2-4c59-921e-a4f2289024e0-kube-api-access-cn7j4\") pod \"nova-metadata-0\" (UID: \"9e114c04-a3c2-4c59-921e-a4f2289024e0\") " pod="openstack/nova-metadata-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.695326 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.740410 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.741932 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-rvllq"] Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.741957 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.742392 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.755248 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.789322 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.789385 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-config\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.789448 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.789480 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-dns-svc\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.789503 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.789517 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4m2k\" (UniqueName: \"kubernetes.io/projected/215c4d89-a098-4983-8deb-44ba6bbfced4-kube-api-access-l4m2k\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq" Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.883429 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.891601 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-config\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.891654 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdce0158-784b-45d3-ac02-836040197f41-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cdce0158-784b-45d3-ac02-836040197f41\") " pod="openstack/nova-api-0"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.891677 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdce0158-784b-45d3-ac02-836040197f41-logs\") pod \"nova-api-0\" (UID: \"cdce0158-784b-45d3-ac02-836040197f41\") " pod="openstack/nova-api-0"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.891736 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.891768 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-dns-svc\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.891787 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkb99\" (UniqueName: \"kubernetes.io/projected/cdce0158-784b-45d3-ac02-836040197f41-kube-api-access-kkb99\") pod \"nova-api-0\" (UID: \"cdce0158-784b-45d3-ac02-836040197f41\") " pod="openstack/nova-api-0"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.891808 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.891828 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4m2k\" (UniqueName: \"kubernetes.io/projected/215c4d89-a098-4983-8deb-44ba6bbfced4-kube-api-access-l4m2k\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.891870 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdce0158-784b-45d3-ac02-836040197f41-config-data\") pod \"nova-api-0\" (UID: \"cdce0158-784b-45d3-ac02-836040197f41\") " pod="openstack/nova-api-0"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.891893 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.895983 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-dns-svc\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.896945 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-config\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.897510 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.898404 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.898425 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.969947 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4m2k\" (UniqueName: \"kubernetes.io/projected/215c4d89-a098-4983-8deb-44ba6bbfced4-kube-api-access-l4m2k\") pod \"dnsmasq-dns-757b4f8459-rvllq\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") " pod="openstack/dnsmasq-dns-757b4f8459-rvllq"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.997530 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdce0158-784b-45d3-ac02-836040197f41-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cdce0158-784b-45d3-ac02-836040197f41\") " pod="openstack/nova-api-0"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.997596 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdce0158-784b-45d3-ac02-836040197f41-logs\") pod \"nova-api-0\" (UID: \"cdce0158-784b-45d3-ac02-836040197f41\") " pod="openstack/nova-api-0"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.997670 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkb99\" (UniqueName: \"kubernetes.io/projected/cdce0158-784b-45d3-ac02-836040197f41-kube-api-access-kkb99\") pod \"nova-api-0\" (UID: \"cdce0158-784b-45d3-ac02-836040197f41\") " pod="openstack/nova-api-0"
Mar 17 11:34:34 crc kubenswrapper[4742]: I0317 11:34:34.997735 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdce0158-784b-45d3-ac02-836040197f41-config-data\") pod \"nova-api-0\" (UID: \"cdce0158-784b-45d3-ac02-836040197f41\") " pod="openstack/nova-api-0"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.003135 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdce0158-784b-45d3-ac02-836040197f41-config-data\") pod \"nova-api-0\" (UID: \"cdce0158-784b-45d3-ac02-836040197f41\") " pod="openstack/nova-api-0"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.003227 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdce0158-784b-45d3-ac02-836040197f41-logs\") pod \"nova-api-0\" (UID: \"cdce0158-784b-45d3-ac02-836040197f41\") " pod="openstack/nova-api-0"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.006451 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdce0158-784b-45d3-ac02-836040197f41-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cdce0158-784b-45d3-ac02-836040197f41\") " pod="openstack/nova-api-0"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.025464 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkb99\" (UniqueName: \"kubernetes.io/projected/cdce0158-784b-45d3-ac02-836040197f41-kube-api-access-kkb99\") pod \"nova-api-0\" (UID: \"cdce0158-784b-45d3-ac02-836040197f41\") " pod="openstack/nova-api-0"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.042330 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-rvllq"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.061487 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 17 11:34:35 crc kubenswrapper[4742]: W0317 11:34:35.089850 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ea48ccf_3d8b_43ec_a543_44f0217629b5.slice/crio-236eddf691ccf858ef6473ff255a927df6256a3987c1982b42b68ab2a7179661 WatchSource:0}: Error finding container 236eddf691ccf858ef6473ff255a927df6256a3987c1982b42b68ab2a7179661: Status 404 returned error can't find the container with id 236eddf691ccf858ef6473ff255a927df6256a3987c1982b42b68ab2a7179661
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.115687 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 17 11:34:35 crc kubenswrapper[4742]: W0317 11:34:35.256685 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba4702f6_9538_41c9_b1fe_f31f4ef9f4c9.slice/crio-63fee69be98d10032d5a5ec2e931802dd6cb861455ee8c54ab6091b10626798f WatchSource:0}: Error finding container 63fee69be98d10032d5a5ec2e931802dd6cb861455ee8c54ab6091b10626798f: Status 404 returned error can't find the container with id 63fee69be98d10032d5a5ec2e931802dd6cb861455ee8c54ab6091b10626798f
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.278881 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mdzj6"]
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.432446 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 17 11:34:35 crc kubenswrapper[4742]: W0317 11:34:35.437742 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbdf90ea_46a2_4da7_a034_110be67d31b4.slice/crio-b8bab2e19605559b14e8775bbad920fd4170102eca0208d54cdbb2ddbb62e9d5 WatchSource:0}: Error finding container b8bab2e19605559b14e8775bbad920fd4170102eca0208d54cdbb2ddbb62e9d5: Status 404 returned error can't find the container with id b8bab2e19605559b14e8775bbad920fd4170102eca0208d54cdbb2ddbb62e9d5
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.478742 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2xsng"]
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.479820 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2xsng"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.482425 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.482622 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.488803 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2xsng"]
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.597861 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.626093 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-config-data\") pod \"nova-cell1-conductor-db-sync-2xsng\" (UID: \"a3bf0544-e60c-4ca7-b535-7d43244a766b\") " pod="openstack/nova-cell1-conductor-db-sync-2xsng"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.626338 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2xsng\" (UID: \"a3bf0544-e60c-4ca7-b535-7d43244a766b\") " pod="openstack/nova-cell1-conductor-db-sync-2xsng"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.626364 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-scripts\") pod \"nova-cell1-conductor-db-sync-2xsng\" (UID: \"a3bf0544-e60c-4ca7-b535-7d43244a766b\") " pod="openstack/nova-cell1-conductor-db-sync-2xsng"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.626423 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4h5h\" (UniqueName: \"kubernetes.io/projected/a3bf0544-e60c-4ca7-b535-7d43244a766b-kube-api-access-t4h5h\") pod \"nova-cell1-conductor-db-sync-2xsng\" (UID: \"a3bf0544-e60c-4ca7-b535-7d43244a766b\") " pod="openstack/nova-cell1-conductor-db-sync-2xsng"
Mar 17 11:34:35 crc kubenswrapper[4742]: W0317 11:34:35.697301 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdce0158_784b_45d3_ac02_836040197f41.slice/crio-24f9f113298f0e8da828083b227884813d76b4d01896efd8fd23bd7139036cb7 WatchSource:0}: Error finding container 24f9f113298f0e8da828083b227884813d76b4d01896efd8fd23bd7139036cb7: Status 404 returned error can't find the container with id 24f9f113298f0e8da828083b227884813d76b4d01896efd8fd23bd7139036cb7
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.701504 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-rvllq"]
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.717458 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.728130 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-config-data\") pod \"nova-cell1-conductor-db-sync-2xsng\" (UID: \"a3bf0544-e60c-4ca7-b535-7d43244a766b\") " pod="openstack/nova-cell1-conductor-db-sync-2xsng"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.728188 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2xsng\" (UID: \"a3bf0544-e60c-4ca7-b535-7d43244a766b\") " pod="openstack/nova-cell1-conductor-db-sync-2xsng"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.728226 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-scripts\") pod \"nova-cell1-conductor-db-sync-2xsng\" (UID: \"a3bf0544-e60c-4ca7-b535-7d43244a766b\") " pod="openstack/nova-cell1-conductor-db-sync-2xsng"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.728348 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4h5h\" (UniqueName: \"kubernetes.io/projected/a3bf0544-e60c-4ca7-b535-7d43244a766b-kube-api-access-t4h5h\") pod \"nova-cell1-conductor-db-sync-2xsng\" (UID: \"a3bf0544-e60c-4ca7-b535-7d43244a766b\") " pod="openstack/nova-cell1-conductor-db-sync-2xsng"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.731929 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-scripts\") pod \"nova-cell1-conductor-db-sync-2xsng\" (UID: \"a3bf0544-e60c-4ca7-b535-7d43244a766b\") " pod="openstack/nova-cell1-conductor-db-sync-2xsng"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.732295 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-config-data\") pod \"nova-cell1-conductor-db-sync-2xsng\" (UID: \"a3bf0544-e60c-4ca7-b535-7d43244a766b\") " pod="openstack/nova-cell1-conductor-db-sync-2xsng"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.737608 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2xsng\" (UID: \"a3bf0544-e60c-4ca7-b535-7d43244a766b\") " pod="openstack/nova-cell1-conductor-db-sync-2xsng"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.749586 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4h5h\" (UniqueName: \"kubernetes.io/projected/a3bf0544-e60c-4ca7-b535-7d43244a766b-kube-api-access-t4h5h\") pod \"nova-cell1-conductor-db-sync-2xsng\" (UID: \"a3bf0544-e60c-4ca7-b535-7d43244a766b\") " pod="openstack/nova-cell1-conductor-db-sync-2xsng"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.803302 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2xsng"
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.977514 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-rvllq" event={"ID":"215c4d89-a098-4983-8deb-44ba6bbfced4","Type":"ContainerStarted","Data":"42662be949242104888bd445725e1c3b63e5ca039880f167694c5fe6351582c8"}
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.977839 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-rvllq" event={"ID":"215c4d89-a098-4983-8deb-44ba6bbfced4","Type":"ContainerStarted","Data":"97dcc4a46915249d07fc35133e926fdf0449379974b73565e08a640e421e1bcc"}
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.989170 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dbdf90ea-46a2-4da7-a034-110be67d31b4","Type":"ContainerStarted","Data":"b8bab2e19605559b14e8775bbad920fd4170102eca0208d54cdbb2ddbb62e9d5"}
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.995985 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mdzj6" event={"ID":"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9","Type":"ContainerStarted","Data":"2e36d6941e4aed6d67d8c74f4fa6ab620b2b6d0eaadcd77a0c001881f7b87bfd"}
Mar 17 11:34:35 crc kubenswrapper[4742]: I0317 11:34:35.996024 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mdzj6" event={"ID":"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9","Type":"ContainerStarted","Data":"63fee69be98d10032d5a5ec2e931802dd6cb861455ee8c54ab6091b10626798f"}
Mar 17 11:34:36 crc kubenswrapper[4742]: I0317 11:34:36.010338 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e114c04-a3c2-4c59-921e-a4f2289024e0","Type":"ContainerStarted","Data":"d3956651a9783da445b3db9f576cf39a23e7f80a6796dbc3b4bb7172e8c4ad7f"}
Mar 17 11:34:36 crc kubenswrapper[4742]: I0317 11:34:36.022958 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mdzj6" podStartSLOduration=3.022940098 podStartE2EDuration="3.022940098s" podCreationTimestamp="2026-03-17 11:34:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:34:36.017394933 +0000 UTC m=+1379.143522691" watchObservedRunningTime="2026-03-17 11:34:36.022940098 +0000 UTC m=+1379.149067856"
Mar 17 11:34:36 crc kubenswrapper[4742]: I0317 11:34:36.023758 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7ea48ccf-3d8b-43ec-a543-44f0217629b5","Type":"ContainerStarted","Data":"236eddf691ccf858ef6473ff255a927df6256a3987c1982b42b68ab2a7179661"}
Mar 17 11:34:36 crc kubenswrapper[4742]: I0317 11:34:36.031303 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cdce0158-784b-45d3-ac02-836040197f41","Type":"ContainerStarted","Data":"24f9f113298f0e8da828083b227884813d76b4d01896efd8fd23bd7139036cb7"}
Mar 17 11:34:36 crc kubenswrapper[4742]: I0317 11:34:36.297568 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2xsng"]
Mar 17 11:34:36 crc kubenswrapper[4742]: W0317 11:34:36.378435 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3bf0544_e60c_4ca7_b535_7d43244a766b.slice/crio-3ea351674adfce6e603ea20349a8cea9f4e009258f72768b998e3dd4d1fb8b05 WatchSource:0}: Error finding container 3ea351674adfce6e603ea20349a8cea9f4e009258f72768b998e3dd4d1fb8b05: Status 404 returned error can't find the container with id 3ea351674adfce6e603ea20349a8cea9f4e009258f72768b998e3dd4d1fb8b05
Mar 17 11:34:37 crc kubenswrapper[4742]: I0317 11:34:37.046426 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2xsng" event={"ID":"a3bf0544-e60c-4ca7-b535-7d43244a766b","Type":"ContainerStarted","Data":"05ab3e8b616f46616a6aed4d636187b891aebc6db5276ac080318e5c8b5f1902"}
Mar 17 11:34:37 crc kubenswrapper[4742]: I0317 11:34:37.046747 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2xsng" event={"ID":"a3bf0544-e60c-4ca7-b535-7d43244a766b","Type":"ContainerStarted","Data":"3ea351674adfce6e603ea20349a8cea9f4e009258f72768b998e3dd4d1fb8b05"}
Mar 17 11:34:37 crc kubenswrapper[4742]: I0317 11:34:37.049192 4742 generic.go:334] "Generic (PLEG): container finished" podID="215c4d89-a098-4983-8deb-44ba6bbfced4" containerID="42662be949242104888bd445725e1c3b63e5ca039880f167694c5fe6351582c8" exitCode=0
Mar 17 11:34:37 crc kubenswrapper[4742]: I0317 11:34:37.049291 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-rvllq" event={"ID":"215c4d89-a098-4983-8deb-44ba6bbfced4","Type":"ContainerDied","Data":"42662be949242104888bd445725e1c3b63e5ca039880f167694c5fe6351582c8"}
Mar 17 11:34:37 crc kubenswrapper[4742]: I0317 11:34:37.095595 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-2xsng" podStartSLOduration=2.095578788 podStartE2EDuration="2.095578788s" podCreationTimestamp="2026-03-17 11:34:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:34:37.071943119 +0000 UTC m=+1380.198070897" watchObservedRunningTime="2026-03-17 11:34:37.095578788 +0000 UTC m=+1380.221706536"
Mar 17 11:34:37 crc kubenswrapper[4742]: I0317 11:34:37.833038 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 17 11:34:37 crc kubenswrapper[4742]: I0317 11:34:37.844697 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 17 11:34:40 crc kubenswrapper[4742]: I0317 11:34:40.086025 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e114c04-a3c2-4c59-921e-a4f2289024e0","Type":"ContainerStarted","Data":"01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95"}
Mar 17 11:34:40 crc kubenswrapper[4742]: I0317 11:34:40.086443 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e114c04-a3c2-4c59-921e-a4f2289024e0","Type":"ContainerStarted","Data":"87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2"}
Mar 17 11:34:40 crc kubenswrapper[4742]: I0317 11:34:40.086369 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9e114c04-a3c2-4c59-921e-a4f2289024e0" containerName="nova-metadata-log" containerID="cri-o://87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2" gracePeriod=30
Mar 17 11:34:40 crc kubenswrapper[4742]: I0317 11:34:40.086665 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9e114c04-a3c2-4c59-921e-a4f2289024e0" containerName="nova-metadata-metadata" containerID="cri-o://01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95" gracePeriod=30
Mar 17 11:34:40 crc kubenswrapper[4742]: I0317 11:34:40.088669 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7ea48ccf-3d8b-43ec-a543-44f0217629b5","Type":"ContainerStarted","Data":"d00ef0e9f48c07fbf65c07740aa130a3e125491bb81b8b0b5b44c29082bda891"}
Mar 17 11:34:40 crc kubenswrapper[4742]: I0317 11:34:40.088779 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="7ea48ccf-3d8b-43ec-a543-44f0217629b5" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d00ef0e9f48c07fbf65c07740aa130a3e125491bb81b8b0b5b44c29082bda891" gracePeriod=30
Mar 17 11:34:40 crc kubenswrapper[4742]: I0317 11:34:40.102491 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cdce0158-784b-45d3-ac02-836040197f41","Type":"ContainerStarted","Data":"5b1074880b680bf40d712f2edd00f5c7b1a5d12f2708005f67ef6645cf94c49b"}
Mar 17 11:34:40 crc kubenswrapper[4742]: I0317 11:34:40.102532 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cdce0158-784b-45d3-ac02-836040197f41","Type":"ContainerStarted","Data":"bdfb3372a83de7fdeaf9f6d19ec598e5f8da8213710a3ce0471c95baaef1c047"}
Mar 17 11:34:40 crc kubenswrapper[4742]: I0317 11:34:40.115117 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-rvllq" event={"ID":"215c4d89-a098-4983-8deb-44ba6bbfced4","Type":"ContainerStarted","Data":"fc005c29098b135cc972ca2293ca45e898145aba5a0df8605de5f85a94bbacab"}
Mar 17 11:34:40 crc kubenswrapper[4742]: I0317 11:34:40.115463 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-rvllq"
Mar 17 11:34:40 crc kubenswrapper[4742]: I0317 11:34:40.130531 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.743484133 podStartE2EDuration="6.13051177s" podCreationTimestamp="2026-03-17 11:34:34 +0000 UTC" firstStartedPulling="2026-03-17 11:34:35.599575176 +0000 UTC m=+1378.725702934" lastFinishedPulling="2026-03-17 11:34:38.986602813 +0000 UTC m=+1382.112730571" observedRunningTime="2026-03-17 11:34:40.114444852 +0000 UTC m=+1383.240572610" watchObservedRunningTime="2026-03-17 11:34:40.13051177 +0000 UTC m=+1383.256639528"
Mar 17 11:34:40 crc kubenswrapper[4742]: I0317 11:34:40.131456 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dbdf90ea-46a2-4da7-a034-110be67d31b4","Type":"ContainerStarted","Data":"a6624aca090960d7b7bd76465764617a3d795195b702c693bc2f139f5d94864a"}
Mar 17 11:34:40 crc kubenswrapper[4742]: I0317 11:34:40.147736 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.266163847 podStartE2EDuration="6.147723859s" podCreationTimestamp="2026-03-17 11:34:34 +0000 UTC" firstStartedPulling="2026-03-17 11:34:35.111528281 +0000 UTC m=+1378.237656039" lastFinishedPulling="2026-03-17 11:34:38.993088273 +0000 UTC m=+1382.119216051" observedRunningTime="2026-03-17 11:34:40.146269809 +0000 UTC m=+1383.272397567" watchObservedRunningTime="2026-03-17 11:34:40.147723859 +0000 UTC m=+1383.273851607"
Mar 17 11:34:40 crc kubenswrapper[4742]: I0317 11:34:40.223142 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-rvllq" podStartSLOduration=6.223124681 podStartE2EDuration="6.223124681s" podCreationTimestamp="2026-03-17 11:34:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:34:40.220386985 +0000 UTC m=+1383.346514743" watchObservedRunningTime="2026-03-17 11:34:40.223124681 +0000 UTC m=+1383.349252429"
Mar 17 11:34:40 crc kubenswrapper[4742]: I0317 11:34:40.236754 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.962347124 podStartE2EDuration="6.236738121s" podCreationTimestamp="2026-03-17 11:34:34 +0000 UTC" firstStartedPulling="2026-03-17 11:34:35.702367391 +0000 UTC m=+1378.828495149" lastFinishedPulling="2026-03-17 11:34:38.976758388 +0000 UTC m=+1382.102886146" observedRunningTime="2026-03-17 11:34:40.189277798 +0000 UTC m=+1383.315405556" watchObservedRunningTime="2026-03-17 11:34:40.236738121 +0000 UTC m=+1383.362865879"
Mar 17 11:34:40 crc kubenswrapper[4742]: I0317 11:34:40.243737 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.700640569 podStartE2EDuration="6.243723376s" podCreationTimestamp="2026-03-17 11:34:34 +0000 UTC" firstStartedPulling="2026-03-17 11:34:35.443525056 +0000 UTC m=+1378.569652814" lastFinishedPulling="2026-03-17 11:34:38.986607863 +0000 UTC m=+1382.112735621" observedRunningTime="2026-03-17 11:34:40.243277843 +0000 UTC m=+1383.369405611" watchObservedRunningTime="2026-03-17 11:34:40.243723376 +0000 UTC m=+1383.369851134"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.067186 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.144698 4742 generic.go:334] "Generic (PLEG): container finished" podID="9e114c04-a3c2-4c59-921e-a4f2289024e0" containerID="01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95" exitCode=0
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.145176 4742 generic.go:334] "Generic (PLEG): container finished" podID="9e114c04-a3c2-4c59-921e-a4f2289024e0" containerID="87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2" exitCode=143
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.144759 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e114c04-a3c2-4c59-921e-a4f2289024e0","Type":"ContainerDied","Data":"01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95"}
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.145278 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e114c04-a3c2-4c59-921e-a4f2289024e0","Type":"ContainerDied","Data":"87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2"}
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.146178 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9e114c04-a3c2-4c59-921e-a4f2289024e0","Type":"ContainerDied","Data":"d3956651a9783da445b3db9f576cf39a23e7f80a6796dbc3b4bb7172e8c4ad7f"}
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.145428 4742 scope.go:117] "RemoveContainer" containerID="01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.144741 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.159565 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn7j4\" (UniqueName: \"kubernetes.io/projected/9e114c04-a3c2-4c59-921e-a4f2289024e0-kube-api-access-cn7j4\") pod \"9e114c04-a3c2-4c59-921e-a4f2289024e0\" (UID: \"9e114c04-a3c2-4c59-921e-a4f2289024e0\") "
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.159694 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e114c04-a3c2-4c59-921e-a4f2289024e0-combined-ca-bundle\") pod \"9e114c04-a3c2-4c59-921e-a4f2289024e0\" (UID: \"9e114c04-a3c2-4c59-921e-a4f2289024e0\") "
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.159727 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e114c04-a3c2-4c59-921e-a4f2289024e0-config-data\") pod \"9e114c04-a3c2-4c59-921e-a4f2289024e0\" (UID: \"9e114c04-a3c2-4c59-921e-a4f2289024e0\") "
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.159804 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e114c04-a3c2-4c59-921e-a4f2289024e0-logs\") pod \"9e114c04-a3c2-4c59-921e-a4f2289024e0\" (UID: \"9e114c04-a3c2-4c59-921e-a4f2289024e0\") "
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.161421 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e114c04-a3c2-4c59-921e-a4f2289024e0-logs" (OuterVolumeSpecName: "logs") pod "9e114c04-a3c2-4c59-921e-a4f2289024e0" (UID: "9e114c04-a3c2-4c59-921e-a4f2289024e0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.174067 4742 scope.go:117] "RemoveContainer" containerID="87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.183281 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e114c04-a3c2-4c59-921e-a4f2289024e0-kube-api-access-cn7j4" (OuterVolumeSpecName: "kube-api-access-cn7j4") pod "9e114c04-a3c2-4c59-921e-a4f2289024e0" (UID: "9e114c04-a3c2-4c59-921e-a4f2289024e0"). InnerVolumeSpecName "kube-api-access-cn7j4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.186734 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e114c04-a3c2-4c59-921e-a4f2289024e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e114c04-a3c2-4c59-921e-a4f2289024e0" (UID: "9e114c04-a3c2-4c59-921e-a4f2289024e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.203593 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e114c04-a3c2-4c59-921e-a4f2289024e0-config-data" (OuterVolumeSpecName: "config-data") pod "9e114c04-a3c2-4c59-921e-a4f2289024e0" (UID: "9e114c04-a3c2-4c59-921e-a4f2289024e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.261751 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e114c04-a3c2-4c59-921e-a4f2289024e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.261787 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e114c04-a3c2-4c59-921e-a4f2289024e0-config-data\") on node \"crc\" DevicePath \"\""
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.261799 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e114c04-a3c2-4c59-921e-a4f2289024e0-logs\") on node \"crc\" DevicePath \"\""
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.261811 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn7j4\" (UniqueName: \"kubernetes.io/projected/9e114c04-a3c2-4c59-921e-a4f2289024e0-kube-api-access-cn7j4\") on node \"crc\" DevicePath \"\""
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.275961 4742 scope.go:117] "RemoveContainer" containerID="01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95"
Mar 17 11:34:41 crc kubenswrapper[4742]: E0317 11:34:41.276428 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95\": container with ID starting with 01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95 not found: ID does not exist" containerID="01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.276474 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95"} err="failed to get container status \"01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95\": rpc error: code = NotFound desc = could not find container \"01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95\": container with ID starting with 01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95 not found: ID does not exist"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.276500 4742 scope.go:117] "RemoveContainer" containerID="87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2"
Mar 17 11:34:41 crc kubenswrapper[4742]: E0317 11:34:41.276976 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2\": container with ID starting with 87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2 not found: ID does not exist" containerID="87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.277048 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2"} err="failed to get container status \"87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2\": rpc error: code = NotFound desc = could not find container \"87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2\": container with ID starting with 87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2 not found: ID does not exist"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.277074 4742 scope.go:117] "RemoveContainer" containerID="01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.278183 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95"} err="failed to get container status \"01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95\": rpc error: code = NotFound desc = could not find container \"01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95\": container with ID starting with 01833947b5907460da51d35524f76cf570f68ab051e221b63d462c225dbcae95 not found: ID does not exist"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.278219 4742 scope.go:117] "RemoveContainer" containerID="87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.278555 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2"} err="failed to get container status \"87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2\": rpc error: code = NotFound desc = could not find container \"87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2\": container with ID starting with 87a2158d02d151c157ca277c364f8901d65cb387cc835b598132fdcef798e6c2 not found: ID does not exist"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.483844 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.494323 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.539780 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 17 11:34:41 crc kubenswrapper[4742]: E0317 11:34:41.540483 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e114c04-a3c2-4c59-921e-a4f2289024e0" containerName="nova-metadata-metadata"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.540515 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e114c04-a3c2-4c59-921e-a4f2289024e0" containerName="nova-metadata-metadata"
Mar 17 11:34:41 crc kubenswrapper[4742]: E0317 11:34:41.540549 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e114c04-a3c2-4c59-921e-a4f2289024e0" containerName="nova-metadata-log"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.540562 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e114c04-a3c2-4c59-921e-a4f2289024e0" containerName="nova-metadata-log"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.540928 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e114c04-a3c2-4c59-921e-a4f2289024e0" containerName="nova-metadata-log"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.540988 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e114c04-a3c2-4c59-921e-a4f2289024e0" containerName="nova-metadata-metadata"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.542902 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.546896 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.547879 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.553695 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.669069 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.669110 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbw27\" (UniqueName: \"kubernetes.io/projected/c0127628-2fe4-4099-9b5f-8780c3d8555a-kube-api-access-cbw27\") pod \"nova-metadata-0\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.669191 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-config-data\") pod \"nova-metadata-0\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.669211 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.669531 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0127628-2fe4-4099-9b5f-8780c3d8555a-logs\") pod \"nova-metadata-0\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.771204 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-config-data\") pod \"nova-metadata-0\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.771267 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.771332 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0127628-2fe4-4099-9b5f-8780c3d8555a-logs\") pod \"nova-metadata-0\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.771426 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.771449 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbw27\" (UniqueName: \"kubernetes.io/projected/c0127628-2fe4-4099-9b5f-8780c3d8555a-kube-api-access-cbw27\") pod \"nova-metadata-0\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.772288 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0127628-2fe4-4099-9b5f-8780c3d8555a-logs\") pod \"nova-metadata-0\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.776494 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.778174 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.790605 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-config-data\") pod \"nova-metadata-0\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.791154 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbw27\" (UniqueName: \"kubernetes.io/projected/c0127628-2fe4-4099-9b5f-8780c3d8555a-kube-api-access-cbw27\") pod \"nova-metadata-0\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " pod="openstack/nova-metadata-0"
Mar 17 11:34:41 crc kubenswrapper[4742]: I0317 11:34:41.858226 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 17 11:34:42 crc kubenswrapper[4742]: I0317 11:34:42.381311 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 17 11:34:42 crc kubenswrapper[4742]: I0317 11:34:42.686171 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e114c04-a3c2-4c59-921e-a4f2289024e0" path="/var/lib/kubelet/pods/9e114c04-a3c2-4c59-921e-a4f2289024e0/volumes"
Mar 17 11:34:43 crc kubenswrapper[4742]: I0317 11:34:43.174100 4742 generic.go:334] "Generic (PLEG): container finished" podID="ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9" containerID="2e36d6941e4aed6d67d8c74f4fa6ab620b2b6d0eaadcd77a0c001881f7b87bfd" exitCode=0
Mar 17 11:34:43 crc kubenswrapper[4742]: I0317 11:34:43.174231 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mdzj6" event={"ID":"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9","Type":"ContainerDied","Data":"2e36d6941e4aed6d67d8c74f4fa6ab620b2b6d0eaadcd77a0c001881f7b87bfd"}
Mar 17 11:34:43 crc kubenswrapper[4742]: I0317 11:34:43.176780 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0127628-2fe4-4099-9b5f-8780c3d8555a","Type":"ContainerStarted","Data":"7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3"}
Mar 17 11:34:43 crc kubenswrapper[4742]: I0317 11:34:43.176830 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0127628-2fe4-4099-9b5f-8780c3d8555a","Type":"ContainerStarted","Data":"83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab"}
Mar 17 11:34:43 crc kubenswrapper[4742]: I0317 11:34:43.176843 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0127628-2fe4-4099-9b5f-8780c3d8555a","Type":"ContainerStarted","Data":"6a9f741cff2aec8ac60b1a7242e06dc56b26461bb5f4e9c17d40ee1edcdec90b"}
Mar 17 11:34:43 crc kubenswrapper[4742]: I0317 11:34:43.227749 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.227727327 podStartE2EDuration="2.227727327s" podCreationTimestamp="2026-03-17 11:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:34:43.219470827 +0000 UTC m=+1386.345598615" watchObservedRunningTime="2026-03-17 11:34:43.227727327 +0000 UTC m=+1386.353855095"
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.215337 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2xsng" event={"ID":"a3bf0544-e60c-4ca7-b535-7d43244a766b","Type":"ContainerDied","Data":"05ab3e8b616f46616a6aed4d636187b891aebc6db5276ac080318e5c8b5f1902"}
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.216051 4742 generic.go:334] "Generic (PLEG): container finished" podID="a3bf0544-e60c-4ca7-b535-7d43244a766b" containerID="05ab3e8b616f46616a6aed4d636187b891aebc6db5276ac080318e5c8b5f1902" exitCode=0
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.478683 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.696730 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.696893 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.705957 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mdzj6"
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.736103 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.737490 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv57p\" (UniqueName: \"kubernetes.io/projected/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-kube-api-access-sv57p\") pod \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\" (UID: \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\") "
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.737535 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-scripts\") pod \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\" (UID: \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\") "
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.737632 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-combined-ca-bundle\") pod \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\" (UID: \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\") "
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.737702 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-config-data\") pod \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\" (UID: \"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9\") "
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.745175 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-kube-api-access-sv57p" (OuterVolumeSpecName: "kube-api-access-sv57p") pod "ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9" (UID: "ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9"). InnerVolumeSpecName "kube-api-access-sv57p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.746849 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-scripts" (OuterVolumeSpecName: "scripts") pod "ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9" (UID: "ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.767104 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9" (UID: "ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.795105 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-config-data" (OuterVolumeSpecName: "config-data") pod "ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9" (UID: "ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.840539 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv57p\" (UniqueName: \"kubernetes.io/projected/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-kube-api-access-sv57p\") on node \"crc\" DevicePath \"\""
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.840571 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.840581 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 17 11:34:44 crc kubenswrapper[4742]: I0317 11:34:44.840588 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9-config-data\") on node \"crc\" DevicePath \"\""
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.044197 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-rvllq"
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.115856 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.115937 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.131058 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-ljkt4"]
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.131385 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" podUID="5948988b-0036-4d62-9511-23a900c10b83" containerName="dnsmasq-dns" containerID="cri-o://65cd1ca5aa87160533fcaec3928dfcfce67c009df9274d2dd03654bff8b066a6" gracePeriod=10
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.228186 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mdzj6" event={"ID":"ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9","Type":"ContainerDied","Data":"63fee69be98d10032d5a5ec2e931802dd6cb861455ee8c54ab6091b10626798f"}
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.228224 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mdzj6"
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.228228 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63fee69be98d10032d5a5ec2e931802dd6cb861455ee8c54ab6091b10626798f"
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.303054 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.405537 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.405753 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cdce0158-784b-45d3-ac02-836040197f41" containerName="nova-api-log" containerID="cri-o://bdfb3372a83de7fdeaf9f6d19ec598e5f8da8213710a3ce0471c95baaef1c047" gracePeriod=30
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.405840 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cdce0158-784b-45d3-ac02-836040197f41" containerName="nova-api-api" containerID="cri-o://5b1074880b680bf40d712f2edd00f5c7b1a5d12f2708005f67ef6645cf94c49b" gracePeriod=30
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.411684 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" podUID="5948988b-0036-4d62-9511-23a900c10b83" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.172:5353: connect: connection refused"
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.417318 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cdce0158-784b-45d3-ac02-836040197f41" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": EOF"
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.417407 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cdce0158-784b-45d3-ac02-836040197f41" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": EOF"
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.428458 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.428829 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c0127628-2fe4-4099-9b5f-8780c3d8555a" containerName="nova-metadata-log" containerID="cri-o://83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab" gracePeriod=30
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.429054 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c0127628-2fe4-4099-9b5f-8780c3d8555a" containerName="nova-metadata-metadata" containerID="cri-o://7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3" gracePeriod=30
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.795643 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2xsng"
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.797267 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4"
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.867356 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-config\") pod \"5948988b-0036-4d62-9511-23a900c10b83\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") "
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.867440 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bxkc\" (UniqueName: \"kubernetes.io/projected/5948988b-0036-4d62-9511-23a900c10b83-kube-api-access-9bxkc\") pod \"5948988b-0036-4d62-9511-23a900c10b83\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") "
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.868162 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-scripts\") pod \"a3bf0544-e60c-4ca7-b535-7d43244a766b\" (UID: \"a3bf0544-e60c-4ca7-b535-7d43244a766b\") "
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.868244 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-ovsdbserver-nb\") pod \"5948988b-0036-4d62-9511-23a900c10b83\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") "
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.868283 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-dns-swift-storage-0\") pod \"5948988b-0036-4d62-9511-23a900c10b83\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") "
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.868312 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-config-data\") pod \"a3bf0544-e60c-4ca7-b535-7d43244a766b\" (UID: \"a3bf0544-e60c-4ca7-b535-7d43244a766b\") "
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.868336 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4h5h\" (UniqueName: \"kubernetes.io/projected/a3bf0544-e60c-4ca7-b535-7d43244a766b-kube-api-access-t4h5h\") pod \"a3bf0544-e60c-4ca7-b535-7d43244a766b\" (UID: \"a3bf0544-e60c-4ca7-b535-7d43244a766b\") "
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.868370 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-dns-svc\") pod \"5948988b-0036-4d62-9511-23a900c10b83\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") "
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.868478 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-ovsdbserver-sb\") pod \"5948988b-0036-4d62-9511-23a900c10b83\" (UID: \"5948988b-0036-4d62-9511-23a900c10b83\") "
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.868530 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-combined-ca-bundle\") pod \"a3bf0544-e60c-4ca7-b535-7d43244a766b\" (UID: \"a3bf0544-e60c-4ca7-b535-7d43244a766b\") "
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.886731 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-scripts" (OuterVolumeSpecName: "scripts") pod "a3bf0544-e60c-4ca7-b535-7d43244a766b" (UID: "a3bf0544-e60c-4ca7-b535-7d43244a766b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.902856 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.903080 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3bf0544-e60c-4ca7-b535-7d43244a766b-kube-api-access-t4h5h" (OuterVolumeSpecName: "kube-api-access-t4h5h") pod "a3bf0544-e60c-4ca7-b535-7d43244a766b" (UID: "a3bf0544-e60c-4ca7-b535-7d43244a766b"). InnerVolumeSpecName "kube-api-access-t4h5h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.932141 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5948988b-0036-4d62-9511-23a900c10b83-kube-api-access-9bxkc" (OuterVolumeSpecName: "kube-api-access-9bxkc") pod "5948988b-0036-4d62-9511-23a900c10b83" (UID: "5948988b-0036-4d62-9511-23a900c10b83"). InnerVolumeSpecName "kube-api-access-9bxkc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.970890 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5948988b-0036-4d62-9511-23a900c10b83" (UID: "5948988b-0036-4d62-9511-23a900c10b83"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.971377 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bxkc\" (UniqueName: \"kubernetes.io/projected/5948988b-0036-4d62-9511-23a900c10b83-kube-api-access-9bxkc\") on node \"crc\" DevicePath \"\""
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.971397 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.971406 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4h5h\" (UniqueName: \"kubernetes.io/projected/a3bf0544-e60c-4ca7-b535-7d43244a766b-kube-api-access-t4h5h\") on node \"crc\" DevicePath \"\""
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.971415 4742 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.983021 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3bf0544-e60c-4ca7-b535-7d43244a766b" (UID: "a3bf0544-e60c-4ca7-b535-7d43244a766b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:34:45 crc kubenswrapper[4742]: I0317 11:34:45.989334 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5948988b-0036-4d62-9511-23a900c10b83" (UID: "5948988b-0036-4d62-9511-23a900c10b83"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.005519 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5948988b-0036-4d62-9511-23a900c10b83" (UID: "5948988b-0036-4d62-9511-23a900c10b83"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.016615 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-config-data" (OuterVolumeSpecName: "config-data") pod "a3bf0544-e60c-4ca7-b535-7d43244a766b" (UID: "a3bf0544-e60c-4ca7-b535-7d43244a766b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.032728 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-config" (OuterVolumeSpecName: "config") pod "5948988b-0036-4d62-9511-23a900c10b83" (UID: "5948988b-0036-4d62-9511-23a900c10b83"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.035041 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5948988b-0036-4d62-9511-23a900c10b83" (UID: "5948988b-0036-4d62-9511-23a900c10b83"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.040421 4742 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.078326 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-combined-ca-bundle\") pod \"c0127628-2fe4-4099-9b5f-8780c3d8555a\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.078491 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-config-data\") pod \"c0127628-2fe4-4099-9b5f-8780c3d8555a\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.078637 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbw27\" (UniqueName: \"kubernetes.io/projected/c0127628-2fe4-4099-9b5f-8780c3d8555a-kube-api-access-cbw27\") pod \"c0127628-2fe4-4099-9b5f-8780c3d8555a\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.079276 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0127628-2fe4-4099-9b5f-8780c3d8555a-logs\") pod \"c0127628-2fe4-4099-9b5f-8780c3d8555a\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.079429 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-nova-metadata-tls-certs\") pod \"c0127628-2fe4-4099-9b5f-8780c3d8555a\" (UID: \"c0127628-2fe4-4099-9b5f-8780c3d8555a\") " Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.080826 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.081177 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0127628-2fe4-4099-9b5f-8780c3d8555a-logs" (OuterVolumeSpecName: "logs") pod "c0127628-2fe4-4099-9b5f-8780c3d8555a" (UID: "c0127628-2fe4-4099-9b5f-8780c3d8555a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.081194 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.081258 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.081276 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.081289 4742 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5948988b-0036-4d62-9511-23a900c10b83-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.081302 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3bf0544-e60c-4ca7-b535-7d43244a766b-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.084202 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0127628-2fe4-4099-9b5f-8780c3d8555a-kube-api-access-cbw27" (OuterVolumeSpecName: "kube-api-access-cbw27") pod "c0127628-2fe4-4099-9b5f-8780c3d8555a" (UID: "c0127628-2fe4-4099-9b5f-8780c3d8555a"). InnerVolumeSpecName "kube-api-access-cbw27". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.106980 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-config-data" (OuterVolumeSpecName: "config-data") pod "c0127628-2fe4-4099-9b5f-8780c3d8555a" (UID: "c0127628-2fe4-4099-9b5f-8780c3d8555a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.123301 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0127628-2fe4-4099-9b5f-8780c3d8555a" (UID: "c0127628-2fe4-4099-9b5f-8780c3d8555a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.143381 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c0127628-2fe4-4099-9b5f-8780c3d8555a" (UID: "c0127628-2fe4-4099-9b5f-8780c3d8555a"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.182995 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.183052 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.183063 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbw27\" (UniqueName: \"kubernetes.io/projected/c0127628-2fe4-4099-9b5f-8780c3d8555a-kube-api-access-cbw27\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.183077 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0127628-2fe4-4099-9b5f-8780c3d8555a-logs\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.183087 4742 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0127628-2fe4-4099-9b5f-8780c3d8555a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.236220 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2xsng" event={"ID":"a3bf0544-e60c-4ca7-b535-7d43244a766b","Type":"ContainerDied","Data":"3ea351674adfce6e603ea20349a8cea9f4e009258f72768b998e3dd4d1fb8b05"} Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.236269 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ea351674adfce6e603ea20349a8cea9f4e009258f72768b998e3dd4d1fb8b05" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.236318 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2xsng" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.243740 4742 generic.go:334] "Generic (PLEG): container finished" podID="c0127628-2fe4-4099-9b5f-8780c3d8555a" containerID="7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3" exitCode=0 Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.243771 4742 generic.go:334] "Generic (PLEG): container finished" podID="c0127628-2fe4-4099-9b5f-8780c3d8555a" containerID="83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab" exitCode=143 Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.243787 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.243832 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0127628-2fe4-4099-9b5f-8780c3d8555a","Type":"ContainerDied","Data":"7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3"} Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.243887 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0127628-2fe4-4099-9b5f-8780c3d8555a","Type":"ContainerDied","Data":"83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab"} Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.243897 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0127628-2fe4-4099-9b5f-8780c3d8555a","Type":"ContainerDied","Data":"6a9f741cff2aec8ac60b1a7242e06dc56b26461bb5f4e9c17d40ee1edcdec90b"} Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.243926 4742 scope.go:117] "RemoveContainer" containerID="7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.248343 4742 generic.go:334] "Generic (PLEG): container finished" podID="5948988b-0036-4d62-9511-23a900c10b83" containerID="65cd1ca5aa87160533fcaec3928dfcfce67c009df9274d2dd03654bff8b066a6" exitCode=0 Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.248378 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.248380 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" event={"ID":"5948988b-0036-4d62-9511-23a900c10b83","Type":"ContainerDied","Data":"65cd1ca5aa87160533fcaec3928dfcfce67c009df9274d2dd03654bff8b066a6"} Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.248415 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-ljkt4" event={"ID":"5948988b-0036-4d62-9511-23a900c10b83","Type":"ContainerDied","Data":"ea252bee8883e8bcbbbd22f6c1cc990375cbd39f55e5e9c1b2abb76c8ac79587"} Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.266650 4742 generic.go:334] "Generic (PLEG): container finished" podID="cdce0158-784b-45d3-ac02-836040197f41" containerID="bdfb3372a83de7fdeaf9f6d19ec598e5f8da8213710a3ce0471c95baaef1c047" exitCode=143 Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.266880 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cdce0158-784b-45d3-ac02-836040197f41","Type":"ContainerDied","Data":"bdfb3372a83de7fdeaf9f6d19ec598e5f8da8213710a3ce0471c95baaef1c047"} Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.279958 4742 scope.go:117] "RemoveContainer" containerID="83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.306020 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-ljkt4"] Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.314439 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-ljkt4"] Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.321264 4742 scope.go:117] "RemoveContainer" containerID="7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3" Mar 17 11:34:46 crc kubenswrapper[4742]: E0317 11:34:46.321664 4742 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3\": container with ID starting with 7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3 not found: ID does not exist" containerID="7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.321693 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3"} err="failed to get container status \"7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3\": rpc error: code = NotFound desc = could not find container \"7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3\": container with ID starting with 7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3 not found: ID does not exist" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.321714 4742 scope.go:117] "RemoveContainer" containerID="83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab" Mar 17 11:34:46 crc kubenswrapper[4742]: E0317 11:34:46.322070 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab\": container with ID starting with 83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab not found: ID does not exist" containerID="83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.322092 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab"} err="failed to get container status \"83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab\": rpc error: code = NotFound desc = could not find container \"83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab\": container with ID starting with 83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab not found: ID does not exist" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.322106 4742 scope.go:117] "RemoveContainer" containerID="7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.322264 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 17 11:34:46 crc kubenswrapper[4742]: E0317 11:34:46.322773 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0127628-2fe4-4099-9b5f-8780c3d8555a" containerName="nova-metadata-metadata" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.322793 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0127628-2fe4-4099-9b5f-8780c3d8555a" containerName="nova-metadata-metadata" Mar 17 11:34:46 crc kubenswrapper[4742]: E0317 11:34:46.322805 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5948988b-0036-4d62-9511-23a900c10b83" containerName="init" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.322815 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="5948988b-0036-4d62-9511-23a900c10b83" containerName="init" Mar 17 11:34:46 crc kubenswrapper[4742]: E0317 11:34:46.322827 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0127628-2fe4-4099-9b5f-8780c3d8555a" containerName="nova-metadata-log" Mar 17 11:34:46 crc kubenswrapper[4742]: 
I0317 11:34:46.322836 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0127628-2fe4-4099-9b5f-8780c3d8555a" containerName="nova-metadata-log" Mar 17 11:34:46 crc kubenswrapper[4742]: E0317 11:34:46.322859 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5948988b-0036-4d62-9511-23a900c10b83" containerName="dnsmasq-dns" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.322867 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="5948988b-0036-4d62-9511-23a900c10b83" containerName="dnsmasq-dns" Mar 17 11:34:46 crc kubenswrapper[4742]: E0317 11:34:46.322890 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9" containerName="nova-manage" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.322900 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9" containerName="nova-manage" Mar 17 11:34:46 crc kubenswrapper[4742]: E0317 11:34:46.322964 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3bf0544-e60c-4ca7-b535-7d43244a766b" containerName="nova-cell1-conductor-db-sync" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.322973 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3bf0544-e60c-4ca7-b535-7d43244a766b" containerName="nova-cell1-conductor-db-sync" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.323191 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0127628-2fe4-4099-9b5f-8780c3d8555a" containerName="nova-metadata-metadata" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.323209 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9" containerName="nova-manage" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.323225 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="5948988b-0036-4d62-9511-23a900c10b83" containerName="dnsmasq-dns" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.323242 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0127628-2fe4-4099-9b5f-8780c3d8555a" containerName="nova-metadata-log" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.323263 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3bf0544-e60c-4ca7-b535-7d43244a766b" containerName="nova-cell1-conductor-db-sync" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.324072 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.324683 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3"} err="failed to get container status \"7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3\": rpc error: code = NotFound desc = could not find container \"7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3\": container with ID starting with 7b78b5b9f09f75a451bd63a326a30a7023a5902f9cdb720c4a123db0ace2d5f3 not found: ID does not exist" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.324711 4742 scope.go:117] "RemoveContainer" containerID="83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.325082 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab"} err="failed to get container status \"83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab\": rpc error: code = NotFound desc = could not find container \"83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab\": container with ID starting with 83e0bf08f93a2f1060719c37835501c1687cc068be2a2790731a11638e44ecab not found: ID does not exist" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.325103 4742 scope.go:117] "RemoveContainer" containerID="65cd1ca5aa87160533fcaec3928dfcfce67c009df9274d2dd03654bff8b066a6" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.328984 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.363690 4742 scope.go:117] "RemoveContainer" containerID="3c98d1b47193f72683a76ca8146296b6516a60dafc118d5658643e658d4a17fe" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.364451 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.387520 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ecd8fa-016c-43b5-9d9f-42c776c8e38d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"47ecd8fa-016c-43b5-9d9f-42c776c8e38d\") " pod="openstack/nova-cell1-conductor-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.387643 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6msk\" (UniqueName: \"kubernetes.io/projected/47ecd8fa-016c-43b5-9d9f-42c776c8e38d-kube-api-access-l6msk\") pod \"nova-cell1-conductor-0\" (UID: \"47ecd8fa-016c-43b5-9d9f-42c776c8e38d\") " pod="openstack/nova-cell1-conductor-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.387974 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ecd8fa-016c-43b5-9d9f-42c776c8e38d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"47ecd8fa-016c-43b5-9d9f-42c776c8e38d\") " pod="openstack/nova-cell1-conductor-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.394426 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.406344 4742 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.407390 4742 scope.go:117] "RemoveContainer" containerID="65cd1ca5aa87160533fcaec3928dfcfce67c009df9274d2dd03654bff8b066a6" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.407900 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: E0317 11:34:46.410296 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65cd1ca5aa87160533fcaec3928dfcfce67c009df9274d2dd03654bff8b066a6\": container with ID starting with 65cd1ca5aa87160533fcaec3928dfcfce67c009df9274d2dd03654bff8b066a6 not found: ID does not exist" containerID="65cd1ca5aa87160533fcaec3928dfcfce67c009df9274d2dd03654bff8b066a6" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.410336 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65cd1ca5aa87160533fcaec3928dfcfce67c009df9274d2dd03654bff8b066a6"} err="failed to get container status \"65cd1ca5aa87160533fcaec3928dfcfce67c009df9274d2dd03654bff8b066a6\": rpc error: code = NotFound desc = could not find container \"65cd1ca5aa87160533fcaec3928dfcfce67c009df9274d2dd03654bff8b066a6\": container with ID starting with 65cd1ca5aa87160533fcaec3928dfcfce67c009df9274d2dd03654bff8b066a6 not found: ID does not exist" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.410370 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.410381 4742 scope.go:117] "RemoveContainer" containerID="3c98d1b47193f72683a76ca8146296b6516a60dafc118d5658643e658d4a17fe" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.410856 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 17 11:34:46 crc kubenswrapper[4742]: E0317 11:34:46.411061 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c98d1b47193f72683a76ca8146296b6516a60dafc118d5658643e658d4a17fe\": container with ID starting with 3c98d1b47193f72683a76ca8146296b6516a60dafc118d5658643e658d4a17fe not found: ID does not exist" containerID="3c98d1b47193f72683a76ca8146296b6516a60dafc118d5658643e658d4a17fe" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.411094 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c98d1b47193f72683a76ca8146296b6516a60dafc118d5658643e658d4a17fe"} err="failed to get container status \"3c98d1b47193f72683a76ca8146296b6516a60dafc118d5658643e658d4a17fe\": rpc error: code = NotFound desc = could not find container \"3c98d1b47193f72683a76ca8146296b6516a60dafc118d5658643e658d4a17fe\": container with ID starting with 3c98d1b47193f72683a76ca8146296b6516a60dafc118d5658643e658d4a17fe not found: ID does not exist" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.420090 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.436955 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.489811 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-config-data\") pod \"nova-metadata-0\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") " pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.489874 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ecd8fa-016c-43b5-9d9f-42c776c8e38d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"47ecd8fa-016c-43b5-9d9f-42c776c8e38d\") " pod="openstack/nova-cell1-conductor-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.489921 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdmwj\" (UniqueName: \"kubernetes.io/projected/f3d29251-108b-4705-8e84-36f40549b65c-kube-api-access-rdmwj\") pod \"nova-metadata-0\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") " pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.489958 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") " pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.489994 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6msk\" (UniqueName: \"kubernetes.io/projected/47ecd8fa-016c-43b5-9d9f-42c776c8e38d-kube-api-access-l6msk\") pod \"nova-cell1-conductor-0\" (UID: \"47ecd8fa-016c-43b5-9d9f-42c776c8e38d\") " pod="openstack/nova-cell1-conductor-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.490040 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ecd8fa-016c-43b5-9d9f-42c776c8e38d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"47ecd8fa-016c-43b5-9d9f-42c776c8e38d\") " pod="openstack/nova-cell1-conductor-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.490065 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3d29251-108b-4705-8e84-36f40549b65c-logs\") pod \"nova-metadata-0\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") " pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.490083 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") " pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.493209 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ecd8fa-016c-43b5-9d9f-42c776c8e38d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"47ecd8fa-016c-43b5-9d9f-42c776c8e38d\") " pod="openstack/nova-cell1-conductor-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.495527 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ecd8fa-016c-43b5-9d9f-42c776c8e38d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"47ecd8fa-016c-43b5-9d9f-42c776c8e38d\") " pod="openstack/nova-cell1-conductor-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.505523 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6msk\" (UniqueName: \"kubernetes.io/projected/47ecd8fa-016c-43b5-9d9f-42c776c8e38d-kube-api-access-l6msk\") pod \"nova-cell1-conductor-0\" (UID: \"47ecd8fa-016c-43b5-9d9f-42c776c8e38d\") " pod="openstack/nova-cell1-conductor-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.591710 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdmwj\" (UniqueName: \"kubernetes.io/projected/f3d29251-108b-4705-8e84-36f40549b65c-kube-api-access-rdmwj\") pod \"nova-metadata-0\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") " pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.591770 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") " pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.591849 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3d29251-108b-4705-8e84-36f40549b65c-logs\") pod \"nova-metadata-0\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") " pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.591870 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") " pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.591923 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-config-data\") pod \"nova-metadata-0\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") " pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.592469 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3d29251-108b-4705-8e84-36f40549b65c-logs\") pod \"nova-metadata-0\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") " pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.600309 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") " pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.600409 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-config-data\") pod \"nova-metadata-0\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") " pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.606922 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") " pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.615581 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdmwj\" (UniqueName: \"kubernetes.io/projected/f3d29251-108b-4705-8e84-36f40549b65c-kube-api-access-rdmwj\") pod \"nova-metadata-0\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") " pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.652362 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.675371 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5948988b-0036-4d62-9511-23a900c10b83" path="/var/lib/kubelet/pods/5948988b-0036-4d62-9511-23a900c10b83/volumes" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.676171 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0127628-2fe4-4099-9b5f-8780c3d8555a" path="/var/lib/kubelet/pods/c0127628-2fe4-4099-9b5f-8780c3d8555a/volumes" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.737282 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 11:34:46 crc kubenswrapper[4742]: I0317 11:34:46.802546 4742 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pode15fe5ee-73d7-415a-a61c-a0e67d085f3a"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pode15fe5ee-73d7-415a-a61c-a0e67d085f3a] : Timed out while waiting for systemd to remove kubepods-besteffort-pode15fe5ee_73d7_415a_a61c_a0e67d085f3a.slice" Mar 17 11:34:47 crc kubenswrapper[4742]: I0317 11:34:47.200718 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 17 11:34:47 crc kubenswrapper[4742]: W0317 11:34:47.228008 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47ecd8fa_016c_43b5_9d9f_42c776c8e38d.slice/crio-f61b05a87acf0152f9c1db6c1e6efc72e7a6bce5490e40465ef15e402b216941 WatchSource:0}: Error finding container f61b05a87acf0152f9c1db6c1e6efc72e7a6bce5490e40465ef15e402b216941: Status 404 returned error can't find the container with id f61b05a87acf0152f9c1db6c1e6efc72e7a6bce5490e40465ef15e402b216941 Mar 17 11:34:47 crc kubenswrapper[4742]: I0317 11:34:47.296559 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"47ecd8fa-016c-43b5-9d9f-42c776c8e38d","Type":"ContainerStarted","Data":"f61b05a87acf0152f9c1db6c1e6efc72e7a6bce5490e40465ef15e402b216941"} Mar 17 11:34:47 crc kubenswrapper[4742]: I0317 11:34:47.300341 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dbdf90ea-46a2-4da7-a034-110be67d31b4" containerName="nova-scheduler-scheduler" containerID="cri-o://a6624aca090960d7b7bd76465764617a3d795195b702c693bc2f139f5d94864a" gracePeriod=30 Mar 17 11:34:47 crc kubenswrapper[4742]: I0317 11:34:47.304206 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 11:34:48 crc kubenswrapper[4742]: I0317 11:34:48.044266 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:34:48 crc kubenswrapper[4742]: I0317 11:34:48.045063 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:34:48 crc kubenswrapper[4742]: I0317 11:34:48.311834 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3d29251-108b-4705-8e84-36f40549b65c","Type":"ContainerStarted","Data":"172696b90d932e902a62e2470cac16687684f759a9723f95ea5cf6bcbd54800b"} Mar 17 11:34:48 crc kubenswrapper[4742]: I0317 11:34:48.311874 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3d29251-108b-4705-8e84-36f40549b65c","Type":"ContainerStarted","Data":"fb473ad5e8e64e4c8a9277822c3a05864f6e39fa05d7f5e41705574a7d40dc84"} Mar 17 11:34:48 crc kubenswrapper[4742]: I0317 11:34:48.311884 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3d29251-108b-4705-8e84-36f40549b65c","Type":"ContainerStarted","Data":"335c6319b37f14bb22053f6caa256db30c9f4a418b89f2273c45507bd9c3c2a5"} Mar 17 11:34:48 crc kubenswrapper[4742]: I0317 11:34:48.315146 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"47ecd8fa-016c-43b5-9d9f-42c776c8e38d","Type":"ContainerStarted","Data":"c6b413a79fe6b22b688453f9ddcea4eb215f2c685cd7abca6b35d06834a98364"} Mar 17 11:34:48 crc kubenswrapper[4742]: I0317 11:34:48.315479 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 17 11:34:48 crc kubenswrapper[4742]: I0317 11:34:48.344606 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.344587693 podStartE2EDuration="2.344587693s" podCreationTimestamp="2026-03-17 11:34:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:34:48.335334756 +0000 UTC m=+1391.461462524" watchObservedRunningTime="2026-03-17 11:34:48.344587693 +0000 UTC m=+1391.470715451" Mar 17 11:34:48 crc kubenswrapper[4742]: I0317 11:34:48.362014 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.362000319 podStartE2EDuration="2.362000319s" podCreationTimestamp="2026-03-17 11:34:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:34:48.359011526 +0000 UTC m=+1391.485139284" watchObservedRunningTime="2026-03-17 11:34:48.362000319 +0000 UTC m=+1391.488128077" Mar 17 11:34:49 crc kubenswrapper[4742]: E0317 11:34:49.698002 4742 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a6624aca090960d7b7bd76465764617a3d795195b702c693bc2f139f5d94864a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 17 11:34:49 crc kubenswrapper[4742]: E0317 11:34:49.699265 4742 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a6624aca090960d7b7bd76465764617a3d795195b702c693bc2f139f5d94864a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 17 11:34:49 crc kubenswrapper[4742]: E0317 11:34:49.700217 4742 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a6624aca090960d7b7bd76465764617a3d795195b702c693bc2f139f5d94864a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 17 11:34:49 crc kubenswrapper[4742]: E0317 11:34:49.700250 4742 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dbdf90ea-46a2-4da7-a034-110be67d31b4" containerName="nova-scheduler-scheduler" Mar 17 11:34:51 crc kubenswrapper[4742]: I0317 11:34:51.355150 4742 generic.go:334] "Generic (PLEG): container finished" podID="dbdf90ea-46a2-4da7-a034-110be67d31b4" containerID="a6624aca090960d7b7bd76465764617a3d795195b702c693bc2f139f5d94864a" exitCode=0 Mar 17 11:34:51 crc kubenswrapper[4742]: I0317 11:34:51.355272 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dbdf90ea-46a2-4da7-a034-110be67d31b4","Type":"ContainerDied","Data":"a6624aca090960d7b7bd76465764617a3d795195b702c693bc2f139f5d94864a"} Mar 17 11:34:51 crc kubenswrapper[4742]: I0317 11:34:51.439340 4742 scope.go:117] "RemoveContainer" containerID="aba1e2013cc35d6dd6c50570b0324114bfdcc8cd54dc6f804c76a5cc8e0c5862" Mar 17 11:34:51 crc kubenswrapper[4742]: I0317 11:34:51.705738 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 11:34:51 crc kubenswrapper[4742]: I0317 11:34:51.808872 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl2p2\" (UniqueName: \"kubernetes.io/projected/dbdf90ea-46a2-4da7-a034-110be67d31b4-kube-api-access-cl2p2\") pod \"dbdf90ea-46a2-4da7-a034-110be67d31b4\" (UID: \"dbdf90ea-46a2-4da7-a034-110be67d31b4\") " Mar 17 11:34:51 crc kubenswrapper[4742]: I0317 11:34:51.809033 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdf90ea-46a2-4da7-a034-110be67d31b4-combined-ca-bundle\") pod \"dbdf90ea-46a2-4da7-a034-110be67d31b4\" (UID: \"dbdf90ea-46a2-4da7-a034-110be67d31b4\") " Mar 17 11:34:51 crc kubenswrapper[4742]: I0317 11:34:51.809133 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbdf90ea-46a2-4da7-a034-110be67d31b4-config-data\") pod \"dbdf90ea-46a2-4da7-a034-110be67d31b4\" (UID: \"dbdf90ea-46a2-4da7-a034-110be67d31b4\") " Mar 17 11:34:51 crc kubenswrapper[4742]: I0317 11:34:51.814589 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbdf90ea-46a2-4da7-a034-110be67d31b4-kube-api-access-cl2p2" (OuterVolumeSpecName: "kube-api-access-cl2p2") pod "dbdf90ea-46a2-4da7-a034-110be67d31b4" (UID: "dbdf90ea-46a2-4da7-a034-110be67d31b4"). InnerVolumeSpecName "kube-api-access-cl2p2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:34:51 crc kubenswrapper[4742]: I0317 11:34:51.837286 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbdf90ea-46a2-4da7-a034-110be67d31b4-config-data" (OuterVolumeSpecName: "config-data") pod "dbdf90ea-46a2-4da7-a034-110be67d31b4" (UID: "dbdf90ea-46a2-4da7-a034-110be67d31b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:51 crc kubenswrapper[4742]: I0317 11:34:51.845220 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbdf90ea-46a2-4da7-a034-110be67d31b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbdf90ea-46a2-4da7-a034-110be67d31b4" (UID: "dbdf90ea-46a2-4da7-a034-110be67d31b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:51 crc kubenswrapper[4742]: I0317 11:34:51.911277 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl2p2\" (UniqueName: \"kubernetes.io/projected/dbdf90ea-46a2-4da7-a034-110be67d31b4-kube-api-access-cl2p2\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:51 crc kubenswrapper[4742]: I0317 11:34:51.911314 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbdf90ea-46a2-4da7-a034-110be67d31b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:51 crc kubenswrapper[4742]: I0317 11:34:51.911327 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbdf90ea-46a2-4da7-a034-110be67d31b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.186042 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.321034 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkb99\" (UniqueName: \"kubernetes.io/projected/cdce0158-784b-45d3-ac02-836040197f41-kube-api-access-kkb99\") pod \"cdce0158-784b-45d3-ac02-836040197f41\" (UID: \"cdce0158-784b-45d3-ac02-836040197f41\") " Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.321115 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdce0158-784b-45d3-ac02-836040197f41-logs\") pod \"cdce0158-784b-45d3-ac02-836040197f41\" (UID: \"cdce0158-784b-45d3-ac02-836040197f41\") " Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.321161 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdce0158-784b-45d3-ac02-836040197f41-combined-ca-bundle\") pod \"cdce0158-784b-45d3-ac02-836040197f41\" (UID: \"cdce0158-784b-45d3-ac02-836040197f41\") " Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.321197 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdce0158-784b-45d3-ac02-836040197f41-config-data\") pod \"cdce0158-784b-45d3-ac02-836040197f41\" (UID: \"cdce0158-784b-45d3-ac02-836040197f41\") " Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.321551 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdce0158-784b-45d3-ac02-836040197f41-logs" (OuterVolumeSpecName: "logs") pod "cdce0158-784b-45d3-ac02-836040197f41" (UID: "cdce0158-784b-45d3-ac02-836040197f41"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.322119 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdce0158-784b-45d3-ac02-836040197f41-logs\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.325055 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdce0158-784b-45d3-ac02-836040197f41-kube-api-access-kkb99" (OuterVolumeSpecName: "kube-api-access-kkb99") pod "cdce0158-784b-45d3-ac02-836040197f41" (UID: "cdce0158-784b-45d3-ac02-836040197f41"). InnerVolumeSpecName "kube-api-access-kkb99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:34:52 crc kubenswrapper[4742]: E0317 11:34:52.342001 4742 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdce0158-784b-45d3-ac02-836040197f41-combined-ca-bundle podName:cdce0158-784b-45d3-ac02-836040197f41 nodeName:}" failed. No retries permitted until 2026-03-17 11:34:52.841977763 +0000 UTC m=+1395.968105521 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/cdce0158-784b-45d3-ac02-836040197f41-combined-ca-bundle") pod "cdce0158-784b-45d3-ac02-836040197f41" (UID: "cdce0158-784b-45d3-ac02-836040197f41") : error deleting /var/lib/kubelet/pods/cdce0158-784b-45d3-ac02-836040197f41/volume-subpaths: remove /var/lib/kubelet/pods/cdce0158-784b-45d3-ac02-836040197f41/volume-subpaths: no such file or directory Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.344331 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdce0158-784b-45d3-ac02-836040197f41-config-data" (OuterVolumeSpecName: "config-data") pod "cdce0158-784b-45d3-ac02-836040197f41" (UID: "cdce0158-784b-45d3-ac02-836040197f41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.367636 4742 generic.go:334] "Generic (PLEG): container finished" podID="cdce0158-784b-45d3-ac02-836040197f41" containerID="5b1074880b680bf40d712f2edd00f5c7b1a5d12f2708005f67ef6645cf94c49b" exitCode=0 Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.367696 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cdce0158-784b-45d3-ac02-836040197f41","Type":"ContainerDied","Data":"5b1074880b680bf40d712f2edd00f5c7b1a5d12f2708005f67ef6645cf94c49b"} Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.367746 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cdce0158-784b-45d3-ac02-836040197f41","Type":"ContainerDied","Data":"24f9f113298f0e8da828083b227884813d76b4d01896efd8fd23bd7139036cb7"} Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.367765 4742 scope.go:117] "RemoveContainer" containerID="5b1074880b680bf40d712f2edd00f5c7b1a5d12f2708005f67ef6645cf94c49b" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.367875 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.369022 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dbdf90ea-46a2-4da7-a034-110be67d31b4","Type":"ContainerDied","Data":"b8bab2e19605559b14e8775bbad920fd4170102eca0208d54cdbb2ddbb62e9d5"} Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.369054 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.423537 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdce0158-784b-45d3-ac02-836040197f41-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.423947 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkb99\" (UniqueName: \"kubernetes.io/projected/cdce0158-784b-45d3-ac02-836040197f41-kube-api-access-kkb99\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.427510 4742 scope.go:117] "RemoveContainer" containerID="bdfb3372a83de7fdeaf9f6d19ec598e5f8da8213710a3ce0471c95baaef1c047" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.427949 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.444992 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.451645 4742 scope.go:117] "RemoveContainer" containerID="5b1074880b680bf40d712f2edd00f5c7b1a5d12f2708005f67ef6645cf94c49b" Mar 17 11:34:52 crc kubenswrapper[4742]: E0317 11:34:52.452227 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b1074880b680bf40d712f2edd00f5c7b1a5d12f2708005f67ef6645cf94c49b\": container with ID starting with 5b1074880b680bf40d712f2edd00f5c7b1a5d12f2708005f67ef6645cf94c49b not found: ID does not exist" containerID="5b1074880b680bf40d712f2edd00f5c7b1a5d12f2708005f67ef6645cf94c49b" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.452285 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b1074880b680bf40d712f2edd00f5c7b1a5d12f2708005f67ef6645cf94c49b"} err="failed to get container status \"5b1074880b680bf40d712f2edd00f5c7b1a5d12f2708005f67ef6645cf94c49b\": rpc error: code = NotFound desc = could not find container \"5b1074880b680bf40d712f2edd00f5c7b1a5d12f2708005f67ef6645cf94c49b\": container with ID starting with 5b1074880b680bf40d712f2edd00f5c7b1a5d12f2708005f67ef6645cf94c49b not found: ID does not exist" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.452329 4742 scope.go:117] "RemoveContainer" containerID="bdfb3372a83de7fdeaf9f6d19ec598e5f8da8213710a3ce0471c95baaef1c047" Mar 17 11:34:52 crc kubenswrapper[4742]: E0317 11:34:52.452949 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdfb3372a83de7fdeaf9f6d19ec598e5f8da8213710a3ce0471c95baaef1c047\": container with ID starting with bdfb3372a83de7fdeaf9f6d19ec598e5f8da8213710a3ce0471c95baaef1c047 not found: ID does not exist" containerID="bdfb3372a83de7fdeaf9f6d19ec598e5f8da8213710a3ce0471c95baaef1c047" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.452988 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdfb3372a83de7fdeaf9f6d19ec598e5f8da8213710a3ce0471c95baaef1c047"} err="failed to get container status \"bdfb3372a83de7fdeaf9f6d19ec598e5f8da8213710a3ce0471c95baaef1c047\": rpc error: code = NotFound desc = could not find container \"bdfb3372a83de7fdeaf9f6d19ec598e5f8da8213710a3ce0471c95baaef1c047\": container with ID starting with bdfb3372a83de7fdeaf9f6d19ec598e5f8da8213710a3ce0471c95baaef1c047 not found: ID does not 
exist" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.453012 4742 scope.go:117] "RemoveContainer" containerID="a6624aca090960d7b7bd76465764617a3d795195b702c693bc2f139f5d94864a" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.482096 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 11:34:52 crc kubenswrapper[4742]: E0317 11:34:52.483432 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdce0158-784b-45d3-ac02-836040197f41" containerName="nova-api-log" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.483468 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdce0158-784b-45d3-ac02-836040197f41" containerName="nova-api-log" Mar 17 11:34:52 crc kubenswrapper[4742]: E0317 11:34:52.483564 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbdf90ea-46a2-4da7-a034-110be67d31b4" containerName="nova-scheduler-scheduler" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.483579 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbdf90ea-46a2-4da7-a034-110be67d31b4" containerName="nova-scheduler-scheduler" Mar 17 11:34:52 crc kubenswrapper[4742]: E0317 11:34:52.483616 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdce0158-784b-45d3-ac02-836040197f41" containerName="nova-api-api" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.483630 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdce0158-784b-45d3-ac02-836040197f41" containerName="nova-api-api" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.484368 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdce0158-784b-45d3-ac02-836040197f41" containerName="nova-api-api" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.484418 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbdf90ea-46a2-4da7-a034-110be67d31b4" containerName="nova-scheduler-scheduler" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.484465 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdce0158-784b-45d3-ac02-836040197f41" containerName="nova-api-log" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.486285 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.489727 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.498101 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.525506 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9ttb\" (UniqueName: \"kubernetes.io/projected/cbcfc054-1dcd-4a3e-a79d-7574f434b972-kube-api-access-z9ttb\") pod \"nova-scheduler-0\" (UID: \"cbcfc054-1dcd-4a3e-a79d-7574f434b972\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.525585 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbcfc054-1dcd-4a3e-a79d-7574f434b972-config-data\") pod \"nova-scheduler-0\" (UID: \"cbcfc054-1dcd-4a3e-a79d-7574f434b972\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.525604 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbcfc054-1dcd-4a3e-a79d-7574f434b972-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cbcfc054-1dcd-4a3e-a79d-7574f434b972\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.626717 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9ttb\" (UniqueName: \"kubernetes.io/projected/cbcfc054-1dcd-4a3e-a79d-7574f434b972-kube-api-access-z9ttb\") pod \"nova-scheduler-0\" (UID: \"cbcfc054-1dcd-4a3e-a79d-7574f434b972\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.626795 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbcfc054-1dcd-4a3e-a79d-7574f434b972-config-data\") pod \"nova-scheduler-0\" (UID: \"cbcfc054-1dcd-4a3e-a79d-7574f434b972\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.626815 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbcfc054-1dcd-4a3e-a79d-7574f434b972-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cbcfc054-1dcd-4a3e-a79d-7574f434b972\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.636575 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbcfc054-1dcd-4a3e-a79d-7574f434b972-config-data\") pod \"nova-scheduler-0\" (UID: \"cbcfc054-1dcd-4a3e-a79d-7574f434b972\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.636584 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbcfc054-1dcd-4a3e-a79d-7574f434b972-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cbcfc054-1dcd-4a3e-a79d-7574f434b972\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.642852 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9ttb\" (UniqueName: 
\"kubernetes.io/projected/cbcfc054-1dcd-4a3e-a79d-7574f434b972-kube-api-access-z9ttb\") pod \"nova-scheduler-0\" (UID: \"cbcfc054-1dcd-4a3e-a79d-7574f434b972\") " pod="openstack/nova-scheduler-0" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.676978 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbdf90ea-46a2-4da7-a034-110be67d31b4" path="/var/lib/kubelet/pods/dbdf90ea-46a2-4da7-a034-110be67d31b4/volumes" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.823494 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.933573 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdce0158-784b-45d3-ac02-836040197f41-combined-ca-bundle\") pod \"cdce0158-784b-45d3-ac02-836040197f41\" (UID: \"cdce0158-784b-45d3-ac02-836040197f41\") " Mar 17 11:34:52 crc kubenswrapper[4742]: I0317 11:34:52.940504 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdce0158-784b-45d3-ac02-836040197f41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdce0158-784b-45d3-ac02-836040197f41" (UID: "cdce0158-784b-45d3-ac02-836040197f41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.036454 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdce0158-784b-45d3-ac02-836040197f41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.038135 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.048573 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.057561 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.059009 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.062145 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.066246 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.139622 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f18526ff-17a2-445c-b949-5d4a129c7807-config-data\") pod \"nova-api-0\" (UID: \"f18526ff-17a2-445c-b949-5d4a129c7807\") " pod="openstack/nova-api-0" Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.139896 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh77h\" (UniqueName: \"kubernetes.io/projected/f18526ff-17a2-445c-b949-5d4a129c7807-kube-api-access-hh77h\") pod \"nova-api-0\" (UID: \"f18526ff-17a2-445c-b949-5d4a129c7807\") " pod="openstack/nova-api-0" Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.140176 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f18526ff-17a2-445c-b949-5d4a129c7807-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f18526ff-17a2-445c-b949-5d4a129c7807\") " pod="openstack/nova-api-0" Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.140571 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f18526ff-17a2-445c-b949-5d4a129c7807-logs\") pod \"nova-api-0\" (UID: \"f18526ff-17a2-445c-b949-5d4a129c7807\") " pod="openstack/nova-api-0" Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.242969 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f18526ff-17a2-445c-b949-5d4a129c7807-logs\") pod \"nova-api-0\" (UID: \"f18526ff-17a2-445c-b949-5d4a129c7807\") " pod="openstack/nova-api-0" Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.243162 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f18526ff-17a2-445c-b949-5d4a129c7807-config-data\") pod \"nova-api-0\" (UID: \"f18526ff-17a2-445c-b949-5d4a129c7807\") " pod="openstack/nova-api-0" Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.243258 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh77h\" (UniqueName: \"kubernetes.io/projected/f18526ff-17a2-445c-b949-5d4a129c7807-kube-api-access-hh77h\") pod \"nova-api-0\" (UID: \"f18526ff-17a2-445c-b949-5d4a129c7807\") " pod="openstack/nova-api-0" Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.243294 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f18526ff-17a2-445c-b949-5d4a129c7807-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f18526ff-17a2-445c-b949-5d4a129c7807\") " pod="openstack/nova-api-0" Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.243979 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f18526ff-17a2-445c-b949-5d4a129c7807-logs\") pod \"nova-api-0\" (UID: \"f18526ff-17a2-445c-b949-5d4a129c7807\") " 
pod="openstack/nova-api-0" Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.247392 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f18526ff-17a2-445c-b949-5d4a129c7807-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f18526ff-17a2-445c-b949-5d4a129c7807\") " pod="openstack/nova-api-0" Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.248186 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f18526ff-17a2-445c-b949-5d4a129c7807-config-data\") pod \"nova-api-0\" (UID: \"f18526ff-17a2-445c-b949-5d4a129c7807\") " pod="openstack/nova-api-0" Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.267284 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh77h\" (UniqueName: \"kubernetes.io/projected/f18526ff-17a2-445c-b949-5d4a129c7807-kube-api-access-hh77h\") pod \"nova-api-0\" (UID: \"f18526ff-17a2-445c-b949-5d4a129c7807\") " pod="openstack/nova-api-0" Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.289435 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 17 11:34:53 crc kubenswrapper[4742]: W0317 11:34:53.290247 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbcfc054_1dcd_4a3e_a79d_7574f434b972.slice/crio-e11efd7380a8bfce30f44668e874d3fa51a47e3c6aa0dc9715fa6ffc79bccec0 WatchSource:0}: Error finding container e11efd7380a8bfce30f44668e874d3fa51a47e3c6aa0dc9715fa6ffc79bccec0: Status 404 returned error can't find the container with id e11efd7380a8bfce30f44668e874d3fa51a47e3c6aa0dc9715fa6ffc79bccec0 Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.376796 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.383090 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cbcfc054-1dcd-4a3e-a79d-7574f434b972","Type":"ContainerStarted","Data":"e11efd7380a8bfce30f44668e874d3fa51a47e3c6aa0dc9715fa6ffc79bccec0"} Mar 17 11:34:53 crc kubenswrapper[4742]: W0317 11:34:53.930580 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf18526ff_17a2_445c_b949_5d4a129c7807.slice/crio-569b6bf88e5d594984a827656d78d46ae1b1c6fcaf34d611618a612d2263877a WatchSource:0}: Error finding container 569b6bf88e5d594984a827656d78d46ae1b1c6fcaf34d611618a612d2263877a: Status 404 returned error can't find the container with id 569b6bf88e5d594984a827656d78d46ae1b1c6fcaf34d611618a612d2263877a Mar 17 11:34:53 crc kubenswrapper[4742]: I0317 11:34:53.932653 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 11:34:54 crc kubenswrapper[4742]: I0317 11:34:54.400952 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f18526ff-17a2-445c-b949-5d4a129c7807","Type":"ContainerStarted","Data":"ab024a1a847ec4183e2d6f4123e82c16eba58bff1b1288427fb43a029682c7a6"} Mar 17 11:34:54 crc kubenswrapper[4742]: I0317 11:34:54.401252 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f18526ff-17a2-445c-b949-5d4a129c7807","Type":"ContainerStarted","Data":"4bf89a006ab374cf59150d6248aa7aeb9f96c6cf25ed7c8c89f6f8ee8dedac9e"} Mar 17 11:34:54 crc kubenswrapper[4742]: I0317 11:34:54.401265 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f18526ff-17a2-445c-b949-5d4a129c7807","Type":"ContainerStarted","Data":"569b6bf88e5d594984a827656d78d46ae1b1c6fcaf34d611618a612d2263877a"} Mar 17 11:34:54 crc kubenswrapper[4742]: I0317 11:34:54.404025 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cbcfc054-1dcd-4a3e-a79d-7574f434b972","Type":"ContainerStarted","Data":"9513ee6d2312f082bac94466635f07cb10003f04774ca6525f2d14c10caebe35"} Mar 17 11:34:54 crc kubenswrapper[4742]: I0317 11:34:54.421682 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.421662596 podStartE2EDuration="1.421662596s" podCreationTimestamp="2026-03-17 11:34:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:34:54.419352871 +0000 UTC m=+1397.545480629" watchObservedRunningTime="2026-03-17 11:34:54.421662596 +0000 UTC m=+1397.547790354" Mar 17 11:34:54 crc kubenswrapper[4742]: I0317 11:34:54.438890 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.438871726 podStartE2EDuration="2.438871726s" podCreationTimestamp="2026-03-17 11:34:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:34:54.434419162 +0000 UTC m=+1397.560546920" watchObservedRunningTime="2026-03-17 11:34:54.438871726 +0000 UTC m=+1397.564999484" Mar 17 11:34:54 crc kubenswrapper[4742]: I0317 11:34:54.674279 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdce0158-784b-45d3-ac02-836040197f41" 
path="/var/lib/kubelet/pods/cdce0158-784b-45d3-ac02-836040197f41/volumes" Mar 17 11:34:55 crc kubenswrapper[4742]: I0317 11:34:55.509508 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 17 11:34:56 crc kubenswrapper[4742]: I0317 11:34:56.699864 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 17 11:34:56 crc kubenswrapper[4742]: I0317 11:34:56.738395 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 17 11:34:56 crc kubenswrapper[4742]: I0317 11:34:56.738441 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 17 11:34:57 crc kubenswrapper[4742]: I0317 11:34:57.753056 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f3d29251-108b-4705-8e84-36f40549b65c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 11:34:57 crc kubenswrapper[4742]: I0317 11:34:57.753071 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f3d29251-108b-4705-8e84-36f40549b65c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 11:34:57 crc kubenswrapper[4742]: I0317 11:34:57.824300 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 17 11:34:59 crc kubenswrapper[4742]: I0317 11:34:59.096629 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 11:34:59 crc kubenswrapper[4742]: I0317 11:34:59.097340 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b4dcc55d-79fc-492b-980b-527f9a71a89c" containerName="kube-state-metrics" containerID="cri-o://341eb69fcfb599dc7de91fa2492483dfbc6fa83f21a65d9c08516c60f0202081" gracePeriod=30 Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:34:59.474484 4742 generic.go:334] "Generic (PLEG): container finished" podID="b4dcc55d-79fc-492b-980b-527f9a71a89c" containerID="341eb69fcfb599dc7de91fa2492483dfbc6fa83f21a65d9c08516c60f0202081" exitCode=2 Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:34:59.474550 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b4dcc55d-79fc-492b-980b-527f9a71a89c","Type":"ContainerDied","Data":"341eb69fcfb599dc7de91fa2492483dfbc6fa83f21a65d9c08516c60f0202081"} Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:34:59.588362 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:34:59.687287 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5982x\" (UniqueName: \"kubernetes.io/projected/b4dcc55d-79fc-492b-980b-527f9a71a89c-kube-api-access-5982x\") pod \"b4dcc55d-79fc-492b-980b-527f9a71a89c\" (UID: \"b4dcc55d-79fc-492b-980b-527f9a71a89c\") " Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:34:59.693990 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4dcc55d-79fc-492b-980b-527f9a71a89c-kube-api-access-5982x" (OuterVolumeSpecName: "kube-api-access-5982x") pod "b4dcc55d-79fc-492b-980b-527f9a71a89c" (UID: "b4dcc55d-79fc-492b-980b-527f9a71a89c"). InnerVolumeSpecName "kube-api-access-5982x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:34:59.789115 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5982x\" (UniqueName: \"kubernetes.io/projected/b4dcc55d-79fc-492b-980b-527f9a71a89c-kube-api-access-5982x\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.484227 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b4dcc55d-79fc-492b-980b-527f9a71a89c","Type":"ContainerDied","Data":"63e1b818eeff3e0e6e3862a10daa3ff995376df9c6b852294c4d9665668972d7"} Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.484285 4742 scope.go:117] "RemoveContainer" containerID="341eb69fcfb599dc7de91fa2492483dfbc6fa83f21a65d9c08516c60f0202081" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.484347 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.525561 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.534884 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.594621 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 11:35:00 crc kubenswrapper[4742]: E0317 11:35:00.595064 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4dcc55d-79fc-492b-980b-527f9a71a89c" containerName="kube-state-metrics" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.595082 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4dcc55d-79fc-492b-980b-527f9a71a89c" containerName="kube-state-metrics" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.595263 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4dcc55d-79fc-492b-980b-527f9a71a89c" containerName="kube-state-metrics" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.595933 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.597984 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.598109 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.605845 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.676880 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4dcc55d-79fc-492b-980b-527f9a71a89c" path="/var/lib/kubelet/pods/b4dcc55d-79fc-492b-980b-527f9a71a89c/volumes" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.705163 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/47db9f5f-1a39-4137-bc97-bf3192c64ced-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"47db9f5f-1a39-4137-bc97-bf3192c64ced\") " pod="openstack/kube-state-metrics-0" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.705222 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47db9f5f-1a39-4137-bc97-bf3192c64ced-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"47db9f5f-1a39-4137-bc97-bf3192c64ced\") " pod="openstack/kube-state-metrics-0" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.705334 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vxxr\" (UniqueName: \"kubernetes.io/projected/47db9f5f-1a39-4137-bc97-bf3192c64ced-kube-api-access-4vxxr\") pod \"kube-state-metrics-0\" (UID: \"47db9f5f-1a39-4137-bc97-bf3192c64ced\") " pod="openstack/kube-state-metrics-0" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.705383 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/47db9f5f-1a39-4137-bc97-bf3192c64ced-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"47db9f5f-1a39-4137-bc97-bf3192c64ced\") " pod="openstack/kube-state-metrics-0" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.806653 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vxxr\" (UniqueName: \"kubernetes.io/projected/47db9f5f-1a39-4137-bc97-bf3192c64ced-kube-api-access-4vxxr\") pod \"kube-state-metrics-0\" (UID: \"47db9f5f-1a39-4137-bc97-bf3192c64ced\") " pod="openstack/kube-state-metrics-0" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.806745 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/47db9f5f-1a39-4137-bc97-bf3192c64ced-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"47db9f5f-1a39-4137-bc97-bf3192c64ced\") " pod="openstack/kube-state-metrics-0" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.807514 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/47db9f5f-1a39-4137-bc97-bf3192c64ced-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" 
(UID: \"47db9f5f-1a39-4137-bc97-bf3192c64ced\") " pod="openstack/kube-state-metrics-0" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.807593 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47db9f5f-1a39-4137-bc97-bf3192c64ced-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"47db9f5f-1a39-4137-bc97-bf3192c64ced\") " pod="openstack/kube-state-metrics-0" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.811527 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/47db9f5f-1a39-4137-bc97-bf3192c64ced-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"47db9f5f-1a39-4137-bc97-bf3192c64ced\") " pod="openstack/kube-state-metrics-0" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.812171 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47db9f5f-1a39-4137-bc97-bf3192c64ced-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"47db9f5f-1a39-4137-bc97-bf3192c64ced\") " pod="openstack/kube-state-metrics-0" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.812569 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/47db9f5f-1a39-4137-bc97-bf3192c64ced-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"47db9f5f-1a39-4137-bc97-bf3192c64ced\") " pod="openstack/kube-state-metrics-0" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.829533 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vxxr\" (UniqueName: \"kubernetes.io/projected/47db9f5f-1a39-4137-bc97-bf3192c64ced-kube-api-access-4vxxr\") pod \"kube-state-metrics-0\" (UID: \"47db9f5f-1a39-4137-bc97-bf3192c64ced\") " pod="openstack/kube-state-metrics-0" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.916159 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.952377 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.952772 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerName="ceilometer-central-agent" containerID="cri-o://2b7be7092615d2f9627c9bb0be3ad0a70df0cdc04fb9ecfaf80ff88584f3d28a" gracePeriod=30 Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.953386 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerName="proxy-httpd" containerID="cri-o://f194506c036e69d9442a09efbf9c930196c4920974d106053879cdab4935fee5" gracePeriod=30 Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.953456 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerName="sg-core" containerID="cri-o://20519f8eb41ddffbdd35517d36ad95844f9dfc611ffae1a481ddfbdf1a7723fa" gracePeriod=30 Mar 17 11:35:00 crc kubenswrapper[4742]: I0317 11:35:00.953500 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerName="ceilometer-notification-agent" containerID="cri-o://e08a58bd15adbd148ff4db09ba73a2e6dfd8e8873feb7e70224ff14bd5d80a1b" gracePeriod=30 Mar 17 11:35:01 crc kubenswrapper[4742]: W0317 11:35:01.181023 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47db9f5f_1a39_4137_bc97_bf3192c64ced.slice/crio-441bcfb5fa64b4a94e552cc45396b2c5399433ff5bf3e22922335ef29e7616a8 WatchSource:0}: Error finding container 441bcfb5fa64b4a94e552cc45396b2c5399433ff5bf3e22922335ef29e7616a8: Status 404 returned error can't find the container with id 441bcfb5fa64b4a94e552cc45396b2c5399433ff5bf3e22922335ef29e7616a8 Mar 17 11:35:01 crc kubenswrapper[4742]: I0317 11:35:01.181317 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 17 11:35:01 crc kubenswrapper[4742]: I0317 11:35:01.501988 4742 generic.go:334] "Generic (PLEG): container finished" podID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerID="f194506c036e69d9442a09efbf9c930196c4920974d106053879cdab4935fee5" exitCode=0 Mar 17 11:35:01 crc kubenswrapper[4742]: I0317 11:35:01.502354 4742 generic.go:334] "Generic (PLEG): container finished" podID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerID="20519f8eb41ddffbdd35517d36ad95844f9dfc611ffae1a481ddfbdf1a7723fa" exitCode=2 Mar 17 11:35:01 crc kubenswrapper[4742]: I0317 11:35:01.502371 4742 generic.go:334] "Generic (PLEG): container finished" podID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerID="2b7be7092615d2f9627c9bb0be3ad0a70df0cdc04fb9ecfaf80ff88584f3d28a" exitCode=0 Mar 17 11:35:01 crc kubenswrapper[4742]: I0317 11:35:01.502085 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0ea41b4-8c5a-42e8-b589-db1ac541b789","Type":"ContainerDied","Data":"f194506c036e69d9442a09efbf9c930196c4920974d106053879cdab4935fee5"} Mar 17 11:35:01 crc kubenswrapper[4742]: I0317 11:35:01.502501 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f0ea41b4-8c5a-42e8-b589-db1ac541b789","Type":"ContainerDied","Data":"20519f8eb41ddffbdd35517d36ad95844f9dfc611ffae1a481ddfbdf1a7723fa"} Mar 17 11:35:01 crc kubenswrapper[4742]: I0317 11:35:01.502526 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0ea41b4-8c5a-42e8-b589-db1ac541b789","Type":"ContainerDied","Data":"2b7be7092615d2f9627c9bb0be3ad0a70df0cdc04fb9ecfaf80ff88584f3d28a"} Mar 17 11:35:01 crc kubenswrapper[4742]: I0317 11:35:01.507641 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"47db9f5f-1a39-4137-bc97-bf3192c64ced","Type":"ContainerStarted","Data":"441bcfb5fa64b4a94e552cc45396b2c5399433ff5bf3e22922335ef29e7616a8"} Mar 17 11:35:02 crc kubenswrapper[4742]: I0317 11:35:02.526450 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"47db9f5f-1a39-4137-bc97-bf3192c64ced","Type":"ContainerStarted","Data":"02b959a2780e789b4c32529e7171a3cbf6bb7ae00195136c0e30d35a72e429be"} Mar 17 11:35:02 crc kubenswrapper[4742]: I0317 11:35:02.526865 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 17 11:35:02 crc kubenswrapper[4742]: I0317 11:35:02.562611 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.122930745 podStartE2EDuration="2.562587541s" podCreationTimestamp="2026-03-17 11:35:00 +0000 UTC" firstStartedPulling="2026-03-17 11:35:01.184108765 +0000 UTC m=+1404.310236523" lastFinishedPulling="2026-03-17 11:35:01.623765561 +0000 UTC m=+1404.749893319" observedRunningTime="2026-03-17 11:35:02.552990873 +0000 UTC m=+1405.679118661" watchObservedRunningTime="2026-03-17 11:35:02.562587541 +0000 UTC m=+1405.688715299" Mar 17 11:35:02 crc kubenswrapper[4742]: I0317 11:35:02.824735 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 17 11:35:02 crc kubenswrapper[4742]: I0317 11:35:02.855214 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 17 11:35:03 crc kubenswrapper[4742]: I0317 11:35:03.378079 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 11:35:03 crc kubenswrapper[4742]: I0317 11:35:03.378461 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 11:35:03 crc kubenswrapper[4742]: I0317 11:35:03.570722 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.267865 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.376838 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0ea41b4-8c5a-42e8-b589-db1ac541b789-log-httpd\") pod \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.377045 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-scripts\") pod \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.377183 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0ea41b4-8c5a-42e8-b589-db1ac541b789-run-httpd\") pod \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.377231 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v74ct\" (UniqueName: \"kubernetes.io/projected/f0ea41b4-8c5a-42e8-b589-db1ac541b789-kube-api-access-v74ct\") pod \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.377264 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-combined-ca-bundle\") pod \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.377334 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-config-data\") pod \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.377364 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-sg-core-conf-yaml\") pod \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\" (UID: \"f0ea41b4-8c5a-42e8-b589-db1ac541b789\") " Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.378124 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f18526ff-17a2-445c-b949-5d4a129c7807" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.210:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.378189 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f18526ff-17a2-445c-b949-5d4a129c7807" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.210:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.377601 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ea41b4-8c5a-42e8-b589-db1ac541b789-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f0ea41b4-8c5a-42e8-b589-db1ac541b789" (UID: 
"f0ea41b4-8c5a-42e8-b589-db1ac541b789"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.378542 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ea41b4-8c5a-42e8-b589-db1ac541b789-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f0ea41b4-8c5a-42e8-b589-db1ac541b789" (UID: "f0ea41b4-8c5a-42e8-b589-db1ac541b789"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.383206 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-scripts" (OuterVolumeSpecName: "scripts") pod "f0ea41b4-8c5a-42e8-b589-db1ac541b789" (UID: "f0ea41b4-8c5a-42e8-b589-db1ac541b789"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.383276 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ea41b4-8c5a-42e8-b589-db1ac541b789-kube-api-access-v74ct" (OuterVolumeSpecName: "kube-api-access-v74ct") pod "f0ea41b4-8c5a-42e8-b589-db1ac541b789" (UID: "f0ea41b4-8c5a-42e8-b589-db1ac541b789"). InnerVolumeSpecName "kube-api-access-v74ct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.461195 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0ea41b4-8c5a-42e8-b589-db1ac541b789" (UID: "f0ea41b4-8c5a-42e8-b589-db1ac541b789"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.478583 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-config-data" (OuterVolumeSpecName: "config-data") pod "f0ea41b4-8c5a-42e8-b589-db1ac541b789" (UID: "f0ea41b4-8c5a-42e8-b589-db1ac541b789"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.479739 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.479754 4742 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0ea41b4-8c5a-42e8-b589-db1ac541b789-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.479763 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.479771 4742 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0ea41b4-8c5a-42e8-b589-db1ac541b789-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.479781 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v74ct\" (UniqueName: \"kubernetes.io/projected/f0ea41b4-8c5a-42e8-b589-db1ac541b789-kube-api-access-v74ct\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.479792 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.500756 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f0ea41b4-8c5a-42e8-b589-db1ac541b789" (UID: "f0ea41b4-8c5a-42e8-b589-db1ac541b789"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.550722 4742 generic.go:334] "Generic (PLEG): container finished" podID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerID="e08a58bd15adbd148ff4db09ba73a2e6dfd8e8873feb7e70224ff14bd5d80a1b" exitCode=0 Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.551814 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.553476 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0ea41b4-8c5a-42e8-b589-db1ac541b789","Type":"ContainerDied","Data":"e08a58bd15adbd148ff4db09ba73a2e6dfd8e8873feb7e70224ff14bd5d80a1b"} Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.553515 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0ea41b4-8c5a-42e8-b589-db1ac541b789","Type":"ContainerDied","Data":"4bd7b6d149488424331d085373296789fc3431cf57e2dcf477078853a6608686"} Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.553532 4742 scope.go:117] "RemoveContainer" containerID="f194506c036e69d9442a09efbf9c930196c4920974d106053879cdab4935fee5" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.588121 4742 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0ea41b4-8c5a-42e8-b589-db1ac541b789-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.589952 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.597659 4742 scope.go:117] "RemoveContainer" containerID="20519f8eb41ddffbdd35517d36ad95844f9dfc611ffae1a481ddfbdf1a7723fa" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.600162 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.624853 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:35:04 crc kubenswrapper[4742]: E0317 11:35:04.625360 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerName="ceilometer-notification-agent" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.625384 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerName="ceilometer-notification-agent" Mar 17 11:35:04 crc kubenswrapper[4742]: E0317 11:35:04.625397 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerName="proxy-httpd" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.625406 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerName="proxy-httpd" Mar 17 11:35:04 crc kubenswrapper[4742]: E0317 11:35:04.625420 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerName="ceilometer-central-agent" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.625429 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerName="ceilometer-central-agent" Mar 17 11:35:04 crc kubenswrapper[4742]: E0317 11:35:04.625446 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerName="sg-core" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.625454 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerName="sg-core" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.625664 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerName="ceilometer-central-agent" Mar 17 11:35:04 
crc kubenswrapper[4742]: I0317 11:35:04.625693 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerName="proxy-httpd" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.625715 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerName="ceilometer-notification-agent" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.625731 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" containerName="sg-core" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.626004 4742 scope.go:117] "RemoveContainer" containerID="e08a58bd15adbd148ff4db09ba73a2e6dfd8e8873feb7e70224ff14bd5d80a1b" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.629407 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.631702 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.632015 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.632167 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.662322 4742 scope.go:117] "RemoveContainer" containerID="2b7be7092615d2f9627c9bb0be3ad0a70df0cdc04fb9ecfaf80ff88584f3d28a" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.679630 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0ea41b4-8c5a-42e8-b589-db1ac541b789" path="/var/lib/kubelet/pods/f0ea41b4-8c5a-42e8-b589-db1ac541b789/volumes" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.680570 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.690472 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-log-httpd\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.690667 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4xb5\" (UniqueName: \"kubernetes.io/projected/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-kube-api-access-l4xb5\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.690872 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.690954 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 
11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.691001 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-run-httpd\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.691113 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-scripts\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.691168 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.691259 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-config-data\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.696400 4742 scope.go:117] "RemoveContainer" containerID="f194506c036e69d9442a09efbf9c930196c4920974d106053879cdab4935fee5" Mar 17 11:35:04 crc kubenswrapper[4742]: E0317 11:35:04.696999 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f194506c036e69d9442a09efbf9c930196c4920974d106053879cdab4935fee5\": container with ID starting with f194506c036e69d9442a09efbf9c930196c4920974d106053879cdab4935fee5 not found: ID does not exist" containerID="f194506c036e69d9442a09efbf9c930196c4920974d106053879cdab4935fee5" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.697037 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f194506c036e69d9442a09efbf9c930196c4920974d106053879cdab4935fee5"} err="failed to get container status \"f194506c036e69d9442a09efbf9c930196c4920974d106053879cdab4935fee5\": rpc error: code = NotFound desc = could not find container \"f194506c036e69d9442a09efbf9c930196c4920974d106053879cdab4935fee5\": container with ID starting with f194506c036e69d9442a09efbf9c930196c4920974d106053879cdab4935fee5 not found: ID does not exist" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.697061 4742 scope.go:117] "RemoveContainer" containerID="20519f8eb41ddffbdd35517d36ad95844f9dfc611ffae1a481ddfbdf1a7723fa" Mar 17 11:35:04 crc kubenswrapper[4742]: E0317 11:35:04.697559 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20519f8eb41ddffbdd35517d36ad95844f9dfc611ffae1a481ddfbdf1a7723fa\": container with ID starting with 20519f8eb41ddffbdd35517d36ad95844f9dfc611ffae1a481ddfbdf1a7723fa not found: ID does not exist" containerID="20519f8eb41ddffbdd35517d36ad95844f9dfc611ffae1a481ddfbdf1a7723fa" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.697603 4742 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"20519f8eb41ddffbdd35517d36ad95844f9dfc611ffae1a481ddfbdf1a7723fa"} err="failed to get container status \"20519f8eb41ddffbdd35517d36ad95844f9dfc611ffae1a481ddfbdf1a7723fa\": rpc error: code = NotFound desc = could not find container \"20519f8eb41ddffbdd35517d36ad95844f9dfc611ffae1a481ddfbdf1a7723fa\": container with ID starting with 20519f8eb41ddffbdd35517d36ad95844f9dfc611ffae1a481ddfbdf1a7723fa not found: ID does not exist" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.697627 4742 scope.go:117] "RemoveContainer" containerID="e08a58bd15adbd148ff4db09ba73a2e6dfd8e8873feb7e70224ff14bd5d80a1b" Mar 17 11:35:04 crc kubenswrapper[4742]: E0317 11:35:04.698127 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e08a58bd15adbd148ff4db09ba73a2e6dfd8e8873feb7e70224ff14bd5d80a1b\": container with ID starting with e08a58bd15adbd148ff4db09ba73a2e6dfd8e8873feb7e70224ff14bd5d80a1b not found: ID does not exist" containerID="e08a58bd15adbd148ff4db09ba73a2e6dfd8e8873feb7e70224ff14bd5d80a1b" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.698328 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e08a58bd15adbd148ff4db09ba73a2e6dfd8e8873feb7e70224ff14bd5d80a1b"} err="failed to get container status \"e08a58bd15adbd148ff4db09ba73a2e6dfd8e8873feb7e70224ff14bd5d80a1b\": rpc error: code = NotFound desc = could not find container \"e08a58bd15adbd148ff4db09ba73a2e6dfd8e8873feb7e70224ff14bd5d80a1b\": container with ID starting with e08a58bd15adbd148ff4db09ba73a2e6dfd8e8873feb7e70224ff14bd5d80a1b not found: ID does not exist" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.698352 4742 scope.go:117] "RemoveContainer" containerID="2b7be7092615d2f9627c9bb0be3ad0a70df0cdc04fb9ecfaf80ff88584f3d28a" Mar 17 11:35:04 crc kubenswrapper[4742]: E0317 11:35:04.698794 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b7be7092615d2f9627c9bb0be3ad0a70df0cdc04fb9ecfaf80ff88584f3d28a\": container with ID starting with 2b7be7092615d2f9627c9bb0be3ad0a70df0cdc04fb9ecfaf80ff88584f3d28a not found: ID does not exist" containerID="2b7be7092615d2f9627c9bb0be3ad0a70df0cdc04fb9ecfaf80ff88584f3d28a" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.698833 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b7be7092615d2f9627c9bb0be3ad0a70df0cdc04fb9ecfaf80ff88584f3d28a"} err="failed to get container status \"2b7be7092615d2f9627c9bb0be3ad0a70df0cdc04fb9ecfaf80ff88584f3d28a\": rpc error: code = NotFound desc = could not find container \"2b7be7092615d2f9627c9bb0be3ad0a70df0cdc04fb9ecfaf80ff88584f3d28a\": container with ID starting with 2b7be7092615d2f9627c9bb0be3ad0a70df0cdc04fb9ecfaf80ff88584f3d28a not found: ID does not exist" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.740017 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.741151 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.793207 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.793284 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-config-data\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.793400 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-log-httpd\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.793478 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4xb5\" (UniqueName: \"kubernetes.io/projected/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-kube-api-access-l4xb5\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.793521 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.793563 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.793592 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-run-httpd\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.793648 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-scripts\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.794824 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-log-httpd\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.798556 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-run-httpd\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.801408 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.802036 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.805842 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-scripts\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.806670 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.806959 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-config-data\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.825954 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4xb5\" (UniqueName: \"kubernetes.io/projected/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-kube-api-access-l4xb5\") pod \"ceilometer-0\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " pod="openstack/ceilometer-0" Mar 17 11:35:04 crc kubenswrapper[4742]: I0317 11:35:04.950025 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:35:05 crc kubenswrapper[4742]: I0317 11:35:05.428794 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:35:05 crc kubenswrapper[4742]: I0317 11:35:05.568208 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c","Type":"ContainerStarted","Data":"485c557eda21ccae5de3d28b3dff27432b289140d9a5a72c5b8ac1a8bc165df8"} Mar 17 11:35:06 crc kubenswrapper[4742]: I0317 11:35:06.744691 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 17 11:35:06 crc kubenswrapper[4742]: I0317 11:35:06.745372 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 17 11:35:06 crc kubenswrapper[4742]: I0317 11:35:06.750708 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 17 11:35:06 crc kubenswrapper[4742]: I0317 11:35:06.752189 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 17 11:35:07 crc kubenswrapper[4742]: I0317 11:35:07.595923 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c","Type":"ContainerStarted","Data":"88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d"} Mar 17 11:35:08 crc kubenswrapper[4742]: I0317 11:35:08.607240 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c","Type":"ContainerStarted","Data":"adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950"} Mar 17 11:35:08 crc kubenswrapper[4742]: I0317 11:35:08.607499 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c","Type":"ContainerStarted","Data":"8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383"} Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.459760 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.512756 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz64d\" (UniqueName: \"kubernetes.io/projected/7ea48ccf-3d8b-43ec-a543-44f0217629b5-kube-api-access-bz64d\") pod \"7ea48ccf-3d8b-43ec-a543-44f0217629b5\" (UID: \"7ea48ccf-3d8b-43ec-a543-44f0217629b5\") " Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.513185 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea48ccf-3d8b-43ec-a543-44f0217629b5-config-data\") pod \"7ea48ccf-3d8b-43ec-a543-44f0217629b5\" (UID: \"7ea48ccf-3d8b-43ec-a543-44f0217629b5\") " Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.513316 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea48ccf-3d8b-43ec-a543-44f0217629b5-combined-ca-bundle\") pod \"7ea48ccf-3d8b-43ec-a543-44f0217629b5\" (UID: \"7ea48ccf-3d8b-43ec-a543-44f0217629b5\") " Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.522289 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ea48ccf-3d8b-43ec-a543-44f0217629b5-kube-api-access-bz64d" (OuterVolumeSpecName: "kube-api-access-bz64d") pod "7ea48ccf-3d8b-43ec-a543-44f0217629b5" (UID: "7ea48ccf-3d8b-43ec-a543-44f0217629b5"). InnerVolumeSpecName "kube-api-access-bz64d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.540729 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea48ccf-3d8b-43ec-a543-44f0217629b5-config-data" (OuterVolumeSpecName: "config-data") pod "7ea48ccf-3d8b-43ec-a543-44f0217629b5" (UID: "7ea48ccf-3d8b-43ec-a543-44f0217629b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.565298 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea48ccf-3d8b-43ec-a543-44f0217629b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ea48ccf-3d8b-43ec-a543-44f0217629b5" (UID: "7ea48ccf-3d8b-43ec-a543-44f0217629b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.616077 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea48ccf-3d8b-43ec-a543-44f0217629b5-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.616105 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea48ccf-3d8b-43ec-a543-44f0217629b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.616116 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz64d\" (UniqueName: \"kubernetes.io/projected/7ea48ccf-3d8b-43ec-a543-44f0217629b5-kube-api-access-bz64d\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.625618 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.625575 4742 generic.go:334] "Generic (PLEG): container finished" podID="7ea48ccf-3d8b-43ec-a543-44f0217629b5" containerID="d00ef0e9f48c07fbf65c07740aa130a3e125491bb81b8b0b5b44c29082bda891" exitCode=137 Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.625708 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7ea48ccf-3d8b-43ec-a543-44f0217629b5","Type":"ContainerDied","Data":"d00ef0e9f48c07fbf65c07740aa130a3e125491bb81b8b0b5b44c29082bda891"} Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.625760 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7ea48ccf-3d8b-43ec-a543-44f0217629b5","Type":"ContainerDied","Data":"236eddf691ccf858ef6473ff255a927df6256a3987c1982b42b68ab2a7179661"} Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.625785 4742 scope.go:117] "RemoveContainer" containerID="d00ef0e9f48c07fbf65c07740aa130a3e125491bb81b8b0b5b44c29082bda891" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.631590 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c","Type":"ContainerStarted","Data":"d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d"} Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.632289 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.648741 4742 scope.go:117] "RemoveContainer" containerID="d00ef0e9f48c07fbf65c07740aa130a3e125491bb81b8b0b5b44c29082bda891" Mar 17 11:35:10 crc kubenswrapper[4742]: E0317 11:35:10.649154 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d00ef0e9f48c07fbf65c07740aa130a3e125491bb81b8b0b5b44c29082bda891\": container with ID starting with d00ef0e9f48c07fbf65c07740aa130a3e125491bb81b8b0b5b44c29082bda891 not found: ID does not exist" containerID="d00ef0e9f48c07fbf65c07740aa130a3e125491bb81b8b0b5b44c29082bda891" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.649191 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d00ef0e9f48c07fbf65c07740aa130a3e125491bb81b8b0b5b44c29082bda891"} err="failed to get container status \"d00ef0e9f48c07fbf65c07740aa130a3e125491bb81b8b0b5b44c29082bda891\": rpc error: code = NotFound desc = could not find container \"d00ef0e9f48c07fbf65c07740aa130a3e125491bb81b8b0b5b44c29082bda891\": container with ID starting with d00ef0e9f48c07fbf65c07740aa130a3e125491bb81b8b0b5b44c29082bda891 not found: ID does not exist" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.656805 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.197157598 podStartE2EDuration="6.656785013s" podCreationTimestamp="2026-03-17 11:35:04 +0000 UTC" firstStartedPulling="2026-03-17 11:35:05.422594376 +0000 UTC m=+1408.548722134" lastFinishedPulling="2026-03-17 11:35:09.882221791 +0000 UTC m=+1413.008349549" observedRunningTime="2026-03-17 11:35:10.653749798 +0000 UTC m=+1413.779877576" watchObservedRunningTime="2026-03-17 11:35:10.656785013 +0000 UTC m=+1413.782912791" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.716491 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" 
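The "Observed pod startup duration" record for openstack/ceilometer-0 above can be checked by hand: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that gap minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A sketch that reproduces both numbers from the timestamps in the log (field semantics inferred from the values, not taken from kubelet source):

```go
// Recompute the ceilometer-0 startup durations from the logged timestamps.
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	// Layout matches how the timestamps are printed in the log.
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2026-03-17 11:35:04 +0000 UTC")
	firstPull := mustParse("2026-03-17 11:35:05.422594376 +0000 UTC")
	lastPull := mustParse("2026-03-17 11:35:09.882221791 +0000 UTC")
	watched := mustParse("2026-03-17 11:35:10.656785013 +0000 UTC")

	e2e := watched.Sub(created)          // 6.656785013s, matches the log
	slo := e2e - lastPull.Sub(firstPull) // 2.197157598s, matches the log
	fmt.Println(e2e, slo)
}
```

Both printed durations match the logged 6.656785013s and 2.197157598s exactly, which supports the inferred relationship (the SLO figure excludes time spent pulling images).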
pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.729015 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.735979 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 11:35:10 crc kubenswrapper[4742]: E0317 11:35:10.736418 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea48ccf-3d8b-43ec-a543-44f0217629b5" containerName="nova-cell1-novncproxy-novncproxy" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.736435 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea48ccf-3d8b-43ec-a543-44f0217629b5" containerName="nova-cell1-novncproxy-novncproxy" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.736618 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ea48ccf-3d8b-43ec-a543-44f0217629b5" containerName="nova-cell1-novncproxy-novncproxy" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.737362 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.742639 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.747041 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.748787 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.748993 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.819461 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th6tz\" (UniqueName: \"kubernetes.io/projected/00a7a363-ec82-40a4-8121-fd6839727132-kube-api-access-th6tz\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a7a363-ec82-40a4-8121-fd6839727132\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.819516 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00a7a363-ec82-40a4-8121-fd6839727132-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a7a363-ec82-40a4-8121-fd6839727132\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.819593 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a7a363-ec82-40a4-8121-fd6839727132-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a7a363-ec82-40a4-8121-fd6839727132\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.819612 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a7a363-ec82-40a4-8121-fd6839727132-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a7a363-ec82-40a4-8121-fd6839727132\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 
11:35:10.819862 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a7a363-ec82-40a4-8121-fd6839727132-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a7a363-ec82-40a4-8121-fd6839727132\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.921748 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a7a363-ec82-40a4-8121-fd6839727132-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a7a363-ec82-40a4-8121-fd6839727132\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.921788 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a7a363-ec82-40a4-8121-fd6839727132-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a7a363-ec82-40a4-8121-fd6839727132\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.921878 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a7a363-ec82-40a4-8121-fd6839727132-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a7a363-ec82-40a4-8121-fd6839727132\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.921958 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th6tz\" (UniqueName: \"kubernetes.io/projected/00a7a363-ec82-40a4-8121-fd6839727132-kube-api-access-th6tz\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a7a363-ec82-40a4-8121-fd6839727132\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.921999 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00a7a363-ec82-40a4-8121-fd6839727132-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a7a363-ec82-40a4-8121-fd6839727132\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.926151 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.927695 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a7a363-ec82-40a4-8121-fd6839727132-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a7a363-ec82-40a4-8121-fd6839727132\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.928341 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00a7a363-ec82-40a4-8121-fd6839727132-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a7a363-ec82-40a4-8121-fd6839727132\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.928393 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a7a363-ec82-40a4-8121-fd6839727132-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a7a363-ec82-40a4-8121-fd6839727132\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.930260 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a7a363-ec82-40a4-8121-fd6839727132-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a7a363-ec82-40a4-8121-fd6839727132\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:10 crc kubenswrapper[4742]: I0317 11:35:10.954808 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th6tz\" (UniqueName: \"kubernetes.io/projected/00a7a363-ec82-40a4-8121-fd6839727132-kube-api-access-th6tz\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a7a363-ec82-40a4-8121-fd6839727132\") " pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:11 crc kubenswrapper[4742]: I0317 11:35:11.062418 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:11 crc kubenswrapper[4742]: I0317 11:35:11.377235 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 17 11:35:11 crc kubenswrapper[4742]: I0317 11:35:11.377512 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 17 11:35:11 crc kubenswrapper[4742]: I0317 11:35:11.501610 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 17 11:35:11 crc kubenswrapper[4742]: W0317 11:35:11.505524 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00a7a363_ec82_40a4_8121_fd6839727132.slice/crio-c6c87c7fdaec5c798d24039d132989edd9e3a73fe78d5075688a720f6b60f082 WatchSource:0}: Error finding container c6c87c7fdaec5c798d24039d132989edd9e3a73fe78d5075688a720f6b60f082: Status 404 returned error can't find the container with id c6c87c7fdaec5c798d24039d132989edd9e3a73fe78d5075688a720f6b60f082 Mar 17 11:35:11 crc kubenswrapper[4742]: I0317 11:35:11.645557 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"00a7a363-ec82-40a4-8121-fd6839727132","Type":"ContainerStarted","Data":"c6c87c7fdaec5c798d24039d132989edd9e3a73fe78d5075688a720f6b60f082"} Mar 17 11:35:12 crc kubenswrapper[4742]: I0317 11:35:12.698995 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.698977321 podStartE2EDuration="2.698977321s" podCreationTimestamp="2026-03-17 11:35:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:35:12.691394599 +0000 UTC m=+1415.817522357" watchObservedRunningTime="2026-03-17 11:35:12.698977321 +0000 UTC m=+1415.825105079" Mar 17 11:35:12 crc kubenswrapper[4742]: I0317 11:35:12.699348 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ea48ccf-3d8b-43ec-a543-44f0217629b5" path="/var/lib/kubelet/pods/7ea48ccf-3d8b-43ec-a543-44f0217629b5/volumes" Mar 17 11:35:12 crc kubenswrapper[4742]: I0317 11:35:12.700005 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"00a7a363-ec82-40a4-8121-fd6839727132","Type":"ContainerStarted","Data":"2c179742a5d8296de0dccb63e2ba96751a57d7f85d1956f0471c74d7c72f4c97"} Mar 17 11:35:13 crc kubenswrapper[4742]: I0317 11:35:13.383437 4742 kubelet.go:2542] "SyncLoop (probe)" 
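The W0317 manager.go "Failed to process watch event ... Status 404" entry above is a startup race rather than a failure: the cgroup watcher notices the new crio-c6c87c7f... scope before the container is queryable, and a later pass finds it, as the very next PLEG line (ContainerStarted for the same ID) shows. A small sketch of absorbing that class of transient 404 with a bounded retry; the helper names are hypothetical, not cAdvisor code:

```go
// Absorb a create-then-inspect race with a short, bounded retry loop.
package main

import (
	"errors"
	"fmt"
	"time"
)

var errStatus404 = errors.New("Status 404 returned error can't find the container")

// lookup fails until the container has "registered" (simulated by a deadline).
func lookup(id string, readyAt time.Time) error {
	if time.Now().Before(readyAt) {
		return errStatus404
	}
	return nil
}

// withRetry retries transient 404s a few times before giving up, which is
// roughly what turns this class of warning into a non-event.
func withRetry(id string, readyAt time.Time) error {
	var err error
	for i := 0; i < 5; i++ {
		if err = lookup(id, readyAt); !errors.Is(err, errStatus404) {
			return err
		}
		time.Sleep(20 * time.Millisecond)
	}
	return err
}

func main() {
	readyAt := time.Now().Add(50 * time.Millisecond)
	fmt.Println(withRetry("c6c87c7fdaec", readyAt)) // <nil> once the race clears
}
```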
probe="startup" status="started" pod="openstack/nova-api-0" Mar 17 11:35:13 crc kubenswrapper[4742]: I0317 11:35:13.385747 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 17 11:35:13 crc kubenswrapper[4742]: I0317 11:35:13.389901 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 17 11:35:13 crc kubenswrapper[4742]: I0317 11:35:13.686522 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 17 11:35:13 crc kubenswrapper[4742]: I0317 11:35:13.899323 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-dt4p6"] Mar 17 11:35:13 crc kubenswrapper[4742]: I0317 11:35:13.901488 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:13 crc kubenswrapper[4742]: I0317 11:35:13.920266 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-dt4p6"] Mar 17 11:35:13 crc kubenswrapper[4742]: I0317 11:35:13.991495 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-config\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:13 crc kubenswrapper[4742]: I0317 11:35:13.991554 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:13 crc kubenswrapper[4742]: I0317 11:35:13.991588 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:13 crc kubenswrapper[4742]: I0317 11:35:13.991616 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qhdj\" (UniqueName: \"kubernetes.io/projected/ab1d5568-a84d-4397-b93c-6b997192fb30-kube-api-access-6qhdj\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:13 crc kubenswrapper[4742]: I0317 11:35:13.991719 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:13 crc kubenswrapper[4742]: I0317 11:35:13.991837 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:14 crc kubenswrapper[4742]: I0317 11:35:14.093634 
4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:14 crc kubenswrapper[4742]: I0317 11:35:14.093713 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-config\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:14 crc kubenswrapper[4742]: I0317 11:35:14.093748 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:14 crc kubenswrapper[4742]: I0317 11:35:14.093788 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:14 crc kubenswrapper[4742]: I0317 11:35:14.093817 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qhdj\" (UniqueName: \"kubernetes.io/projected/ab1d5568-a84d-4397-b93c-6b997192fb30-kube-api-access-6qhdj\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:14 crc kubenswrapper[4742]: I0317 11:35:14.093932 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:14 crc kubenswrapper[4742]: I0317 11:35:14.094769 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-config\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:14 crc kubenswrapper[4742]: I0317 11:35:14.095530 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:14 crc kubenswrapper[4742]: I0317 11:35:14.095661 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:14 crc kubenswrapper[4742]: I0317 11:35:14.098671 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:14 crc kubenswrapper[4742]: I0317 11:35:14.099186 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:14 crc kubenswrapper[4742]: I0317 11:35:14.118567 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qhdj\" (UniqueName: \"kubernetes.io/projected/ab1d5568-a84d-4397-b93c-6b997192fb30-kube-api-access-6qhdj\") pod \"dnsmasq-dns-89c5cd4d5-dt4p6\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:14 crc kubenswrapper[4742]: I0317 11:35:14.249297 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:14 crc kubenswrapper[4742]: I0317 11:35:14.724415 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-dt4p6"] Mar 17 11:35:15 crc kubenswrapper[4742]: I0317 11:35:15.700567 4742 generic.go:334] "Generic (PLEG): container finished" podID="ab1d5568-a84d-4397-b93c-6b997192fb30" containerID="09caa62f03751fee6040912acc017bda1b9300662caec89f6cadc815b78a79b6" exitCode=0 Mar 17 11:35:15 crc kubenswrapper[4742]: I0317 11:35:15.700642 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" event={"ID":"ab1d5568-a84d-4397-b93c-6b997192fb30","Type":"ContainerDied","Data":"09caa62f03751fee6040912acc017bda1b9300662caec89f6cadc815b78a79b6"} Mar 17 11:35:15 crc kubenswrapper[4742]: I0317 11:35:15.701113 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" event={"ID":"ab1d5568-a84d-4397-b93c-6b997192fb30","Type":"ContainerStarted","Data":"0ce4b0bb2effdc02bb36ab1633384d667ff048fb9218b3beaccb3b4d1e5c5b9a"} Mar 17 11:35:16 crc kubenswrapper[4742]: I0317 11:35:16.063461 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:16 crc kubenswrapper[4742]: I0317 11:35:16.477794 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 17 11:35:16 crc kubenswrapper[4742]: I0317 11:35:16.610153 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:35:16 crc kubenswrapper[4742]: I0317 11:35:16.610467 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerName="ceilometer-central-agent" containerID="cri-o://88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d" gracePeriod=30 Mar 17 11:35:16 crc kubenswrapper[4742]: I0317 11:35:16.610572 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerName="sg-core" containerID="cri-o://adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950" gracePeriod=30 Mar 17 11:35:16 crc kubenswrapper[4742]: I0317 11:35:16.610607 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
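The "Killing container with a grace period" entries surrounding this point are the standard two-phase stop: SIGTERM first, SIGKILL only if the grace period (30s for the ceilometer and nova-api containers here, 600s for the machine-config-daemon later in this log) expires. That is also how to read the nearby exit codes: 143 is 128+SIGTERM (a clean graceful stop, as nova-api-log reports just below) and 137 is 128+SIGKILL (as nova-cell1-novncproxy reported earlier). A process-level sketch with plain os/exec standing in for CRI-O:

```go
// Two-phase stop: polite SIGTERM, forced SIGKILL after the grace period.
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

func stopWithGrace(cmd *exec.Cmd, grace time.Duration) {
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	_ = cmd.Process.Signal(syscall.SIGTERM) // polite phase -> exit code 143
	select {
	case <-done:
		fmt.Println("exited within grace period")
	case <-time.After(grace):
		_ = cmd.Process.Kill() // forced phase -> exit code 137
		<-done
		fmt.Println("killed after grace period")
	}
}

func main() {
	cmd := exec.Command("sleep", "60") // stand-in for the container process
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	stopWithGrace(cmd, 2*time.Second)
}
```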
podUID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerName="ceilometer-notification-agent" containerID="cri-o://8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383" gracePeriod=30 Mar 17 11:35:16 crc kubenswrapper[4742]: I0317 11:35:16.610642 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerName="proxy-httpd" containerID="cri-o://d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d" gracePeriod=30 Mar 17 11:35:16 crc kubenswrapper[4742]: I0317 11:35:16.711750 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f18526ff-17a2-445c-b949-5d4a129c7807" containerName="nova-api-log" containerID="cri-o://4bf89a006ab374cf59150d6248aa7aeb9f96c6cf25ed7c8c89f6f8ee8dedac9e" gracePeriod=30 Mar 17 11:35:16 crc kubenswrapper[4742]: I0317 11:35:16.712770 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" event={"ID":"ab1d5568-a84d-4397-b93c-6b997192fb30","Type":"ContainerStarted","Data":"9fa46cb0281903ea2740e889c78f20f753154a8df89b1ea8118a440585c6bd72"} Mar 17 11:35:16 crc kubenswrapper[4742]: I0317 11:35:16.712817 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:16 crc kubenswrapper[4742]: I0317 11:35:16.713236 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f18526ff-17a2-445c-b949-5d4a129c7807" containerName="nova-api-api" containerID="cri-o://ab024a1a847ec4183e2d6f4123e82c16eba58bff1b1288427fb43a029682c7a6" gracePeriod=30 Mar 17 11:35:16 crc kubenswrapper[4742]: I0317 11:35:16.752081 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" podStartSLOduration=3.752059623 podStartE2EDuration="3.752059623s" podCreationTimestamp="2026-03-17 11:35:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:35:16.734220956 +0000 UTC m=+1419.860348714" watchObservedRunningTime="2026-03-17 11:35:16.752059623 +0000 UTC m=+1419.878187381" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.650199 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.678488 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4xb5\" (UniqueName: \"kubernetes.io/projected/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-kube-api-access-l4xb5\") pod \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.678639 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-sg-core-conf-yaml\") pod \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.678790 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-run-httpd\") pod \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.678832 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-combined-ca-bundle\") pod \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.678863 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-scripts\") pod \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.678977 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-log-httpd\") pod \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.679012 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-config-data\") pod \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.679056 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-ceilometer-tls-certs\") pod \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\" (UID: \"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c\") " Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.679253 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" (UID: "8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.679896 4742 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.679896 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" (UID: "8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.685087 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-scripts" (OuterVolumeSpecName: "scripts") pod "8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" (UID: "8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.687230 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-kube-api-access-l4xb5" (OuterVolumeSpecName: "kube-api-access-l4xb5") pod "8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" (UID: "8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c"). InnerVolumeSpecName "kube-api-access-l4xb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.714033 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" (UID: "8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c"). InnerVolumeSpecName "sg-core-conf-yaml". 
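The teardown entries above always run in the same order: UnmountVolume.TearDown succeeds first, and only then does the reconciler report "Volume detached ... DevicePath \"\"" for that volume. A deliberately simplified sketch of that ordering guarantee (hypothetical types, not the real kubelet volume manager):

```go
// State transition only after the teardown side effect succeeds.
package main

import "fmt"

type volumeState struct{ mounted map[string]bool }

func (s *volumeState) tearDown(name string) error {
	// Real unmount work would happen here (kubernetes.io/secret, empty-dir...).
	fmt.Printf("UnmountVolume.TearDown succeeded for volume %q\n", name)
	return nil
}

func (s *volumeState) unmount(name string) {
	if err := s.tearDown(name); err != nil {
		return // stays mounted; the reconciler retries on its next loop
	}
	delete(s.mounted, name) // only now is "Volume detached" reported
	fmt.Printf("Volume detached for volume %q\n", name)
}

func main() {
	s := &volumeState{mounted: map[string]bool{"run-httpd": true, "scripts": true}}
	for name := range s.mounted {
		s.unmount(name)
	}
}
```

The payoff of this ordering is that a volume is never reported detached while its mount point could still hold data, so a failed teardown simply leaves the volume in the list to be retried.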
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.723094 4742 generic.go:334] "Generic (PLEG): container finished" podID="f18526ff-17a2-445c-b949-5d4a129c7807" containerID="4bf89a006ab374cf59150d6248aa7aeb9f96c6cf25ed7c8c89f6f8ee8dedac9e" exitCode=143 Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.723156 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f18526ff-17a2-445c-b949-5d4a129c7807","Type":"ContainerDied","Data":"4bf89a006ab374cf59150d6248aa7aeb9f96c6cf25ed7c8c89f6f8ee8dedac9e"} Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.731614 4742 generic.go:334] "Generic (PLEG): container finished" podID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerID="d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d" exitCode=0 Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.731649 4742 generic.go:334] "Generic (PLEG): container finished" podID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerID="adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950" exitCode=2 Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.731657 4742 generic.go:334] "Generic (PLEG): container finished" podID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerID="8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383" exitCode=0 Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.731664 4742 generic.go:334] "Generic (PLEG): container finished" podID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerID="88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d" exitCode=0 Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.731681 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.731757 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c","Type":"ContainerDied","Data":"d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d"} Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.731784 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c","Type":"ContainerDied","Data":"adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950"} Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.731794 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c","Type":"ContainerDied","Data":"8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383"} Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.731806 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c","Type":"ContainerDied","Data":"88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d"} Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.731815 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c","Type":"ContainerDied","Data":"485c557eda21ccae5de3d28b3dff27432b289140d9a5a72c5b8ac1a8bc165df8"} Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.731832 4742 scope.go:117] "RemoveContainer" containerID="d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.750010 4742 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" (UID: "8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.781068 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4xb5\" (UniqueName: \"kubernetes.io/projected/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-kube-api-access-l4xb5\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.781096 4742 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.781105 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-scripts\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.781114 4742 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.781122 4742 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.790956 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" (UID: "8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.819140 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-config-data" (OuterVolumeSpecName: "config-data") pod "8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" (UID: "8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.883125 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.883154 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.900693 4742 scope.go:117] "RemoveContainer" containerID="adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.921998 4742 scope.go:117] "RemoveContainer" containerID="8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.947121 4742 scope.go:117] "RemoveContainer" containerID="88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.984472 4742 scope.go:117] "RemoveContainer" containerID="d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d" Mar 17 11:35:17 crc kubenswrapper[4742]: E0317 11:35:17.984874 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d\": container with ID starting with d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d not found: ID does not exist" containerID="d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.984936 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d"} err="failed to get container status \"d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d\": rpc error: code = NotFound desc = could not find container \"d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d\": container with ID starting with d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d not found: ID does not exist" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.984967 4742 scope.go:117] "RemoveContainer" containerID="adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950" Mar 17 11:35:17 crc kubenswrapper[4742]: E0317 11:35:17.985198 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950\": container with ID starting with adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950 not found: ID does not exist" containerID="adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.985220 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950"} err="failed to get container status \"adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950\": rpc error: code = NotFound desc = could not find container \"adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950\": container with ID starting with 
adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950 not found: ID does not exist" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.985261 4742 scope.go:117] "RemoveContainer" containerID="8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383" Mar 17 11:35:17 crc kubenswrapper[4742]: E0317 11:35:17.985558 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383\": container with ID starting with 8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383 not found: ID does not exist" containerID="8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.985582 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383"} err="failed to get container status \"8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383\": rpc error: code = NotFound desc = could not find container \"8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383\": container with ID starting with 8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383 not found: ID does not exist" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.985595 4742 scope.go:117] "RemoveContainer" containerID="88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d" Mar 17 11:35:17 crc kubenswrapper[4742]: E0317 11:35:17.985836 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d\": container with ID starting with 88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d not found: ID does not exist" containerID="88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.985884 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d"} err="failed to get container status \"88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d\": rpc error: code = NotFound desc = could not find container \"88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d\": container with ID starting with 88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d not found: ID does not exist" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.985933 4742 scope.go:117] "RemoveContainer" containerID="d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.986237 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d"} err="failed to get container status \"d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d\": rpc error: code = NotFound desc = could not find container \"d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d\": container with ID starting with d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d not found: ID does not exist" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.986262 4742 scope.go:117] "RemoveContainer" containerID="adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950" Mar 17 11:35:17 crc 
kubenswrapper[4742]: I0317 11:35:17.986475 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950"} err="failed to get container status \"adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950\": rpc error: code = NotFound desc = could not find container \"adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950\": container with ID starting with adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950 not found: ID does not exist" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.986497 4742 scope.go:117] "RemoveContainer" containerID="8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.986659 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383"} err="failed to get container status \"8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383\": rpc error: code = NotFound desc = could not find container \"8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383\": container with ID starting with 8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383 not found: ID does not exist" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.986679 4742 scope.go:117] "RemoveContainer" containerID="88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.986851 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d"} err="failed to get container status \"88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d\": rpc error: code = NotFound desc = could not find container \"88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d\": container with ID starting with 88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d not found: ID does not exist" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.986870 4742 scope.go:117] "RemoveContainer" containerID="d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.987267 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d"} err="failed to get container status \"d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d\": rpc error: code = NotFound desc = could not find container \"d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d\": container with ID starting with d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d not found: ID does not exist" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.987306 4742 scope.go:117] "RemoveContainer" containerID="adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.987495 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950"} err="failed to get container status \"adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950\": rpc error: code = NotFound desc = could not find container \"adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950\": container with ID 
starting with adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950 not found: ID does not exist" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.987520 4742 scope.go:117] "RemoveContainer" containerID="8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.987762 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383"} err="failed to get container status \"8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383\": rpc error: code = NotFound desc = could not find container \"8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383\": container with ID starting with 8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383 not found: ID does not exist" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.987791 4742 scope.go:117] "RemoveContainer" containerID="88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.988031 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d"} err="failed to get container status \"88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d\": rpc error: code = NotFound desc = could not find container \"88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d\": container with ID starting with 88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d not found: ID does not exist" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.988055 4742 scope.go:117] "RemoveContainer" containerID="d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.988260 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d"} err="failed to get container status \"d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d\": rpc error: code = NotFound desc = could not find container \"d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d\": container with ID starting with d281391c2d9e3a379b08c41c75163348bf9c78fcedbc8ff544f09e1144cb676d not found: ID does not exist" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.988284 4742 scope.go:117] "RemoveContainer" containerID="adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.988440 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950"} err="failed to get container status \"adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950\": rpc error: code = NotFound desc = could not find container \"adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950\": container with ID starting with adbf1bcb8706e02926f78dc40090687b4cdde358fc16dd62eff5383ef066e950 not found: ID does not exist" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.988457 4742 scope.go:117] "RemoveContainer" containerID="8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.988598 4742 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383"} err="failed to get container status \"8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383\": rpc error: code = NotFound desc = could not find container \"8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383\": container with ID starting with 8b4326aec443d0764dd96f77b6ffe9467e0640291198457ce744e5dedcb65383 not found: ID does not exist" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.988610 4742 scope.go:117] "RemoveContainer" containerID="88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d" Mar 17 11:35:17 crc kubenswrapper[4742]: I0317 11:35:17.988742 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d"} err="failed to get container status \"88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d\": rpc error: code = NotFound desc = could not find container \"88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d\": container with ID starting with 88f43a80e6ee9a1b7491084f5c76930b6c65f8b31fbaca63c48c78df04073e9d not found: ID does not exist" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.043834 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.043922 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.043977 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.044751 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1aeee9892509f65c6f012471968b84d5122ab43ea074794d2d7aecfdfae8d433"} pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.044813 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" containerID="cri-o://1aeee9892509f65c6f012471968b84d5122ab43ea074794d2d7aecfdfae8d433" gracePeriod=600 Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.064843 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.082798 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.110862 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:35:18 crc kubenswrapper[4742]: E0317 11:35:18.111488 4742 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerName="ceilometer-notification-agent" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.111517 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerName="ceilometer-notification-agent" Mar 17 11:35:18 crc kubenswrapper[4742]: E0317 11:35:18.111538 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerName="proxy-httpd" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.111553 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerName="proxy-httpd" Mar 17 11:35:18 crc kubenswrapper[4742]: E0317 11:35:18.111572 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerName="ceilometer-central-agent" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.111585 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerName="ceilometer-central-agent" Mar 17 11:35:18 crc kubenswrapper[4742]: E0317 11:35:18.111621 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerName="sg-core" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.111635 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerName="sg-core" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.111980 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerName="ceilometer-notification-agent" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.112017 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerName="ceilometer-central-agent" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.112039 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerName="sg-core" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.112061 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" containerName="proxy-httpd" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.115324 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.121652 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.122455 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.122718 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.122828 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.190835 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.190870 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-scripts\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.190921 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dldkk\" (UniqueName: \"kubernetes.io/projected/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-kube-api-access-dldkk\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.190967 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.190996 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-config-data\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.191019 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-log-httpd\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.191036 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-run-httpd\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.191062 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.292100 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dldkk\" (UniqueName: \"kubernetes.io/projected/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-kube-api-access-dldkk\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.292467 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.292508 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-config-data\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.292565 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-log-httpd\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.292586 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-run-httpd\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.292634 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.292708 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.292734 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-scripts\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.293494 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-run-httpd\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.293581 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-log-httpd\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.305362 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.305390 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.305476 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.306857 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-config-data\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.308734 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-scripts\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.312594 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dldkk\" (UniqueName: \"kubernetes.io/projected/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-kube-api-access-dldkk\") pod \"ceilometer-0\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") " pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.417847 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.418649 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.680843 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c" path="/var/lib/kubelet/pods/8c3d80ae-cfe0-4fb8-80cc-42caff3b1d7c/volumes" Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.746240 4742 generic.go:334] "Generic (PLEG): container finished" podID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerID="1aeee9892509f65c6f012471968b84d5122ab43ea074794d2d7aecfdfae8d433" exitCode=0 Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.746443 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerDied","Data":"1aeee9892509f65c6f012471968b84d5122ab43ea074794d2d7aecfdfae8d433"} Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.746551 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d"} Mar 17 11:35:18 crc kubenswrapper[4742]: I0317 11:35:18.746585 4742 scope.go:117] "RemoveContainer" containerID="a5ef1667f2e6dd9db693993b9f4f126e4ca6164458a0fe8e5b3f3f6b5159b8d2" Mar 17 11:35:19 crc kubenswrapper[4742]: I0317 11:35:19.004827 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 17 11:35:19 crc kubenswrapper[4742]: I0317 11:35:19.767325 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1","Type":"ContainerStarted","Data":"529f6308d569c883e9c969f479760566848fa03f627508d3e7662e7521abd424"} Mar 17 11:35:19 crc kubenswrapper[4742]: I0317 11:35:19.767360 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1","Type":"ContainerStarted","Data":"ae8a2b14142fadd2b21171b313b2155ee6ee1b5ebfdd8fb088add351b7a6eb19"} Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.440590 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.643197 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f18526ff-17a2-445c-b949-5d4a129c7807-combined-ca-bundle\") pod \"f18526ff-17a2-445c-b949-5d4a129c7807\" (UID: \"f18526ff-17a2-445c-b949-5d4a129c7807\") " Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.643271 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f18526ff-17a2-445c-b949-5d4a129c7807-config-data\") pod \"f18526ff-17a2-445c-b949-5d4a129c7807\" (UID: \"f18526ff-17a2-445c-b949-5d4a129c7807\") " Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.643440 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh77h\" (UniqueName: \"kubernetes.io/projected/f18526ff-17a2-445c-b949-5d4a129c7807-kube-api-access-hh77h\") pod \"f18526ff-17a2-445c-b949-5d4a129c7807\" (UID: \"f18526ff-17a2-445c-b949-5d4a129c7807\") " Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.643500 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f18526ff-17a2-445c-b949-5d4a129c7807-logs\") pod \"f18526ff-17a2-445c-b949-5d4a129c7807\" (UID: \"f18526ff-17a2-445c-b949-5d4a129c7807\") " Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.644034 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f18526ff-17a2-445c-b949-5d4a129c7807-logs" (OuterVolumeSpecName: "logs") pod "f18526ff-17a2-445c-b949-5d4a129c7807" (UID: "f18526ff-17a2-445c-b949-5d4a129c7807"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.668299 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f18526ff-17a2-445c-b949-5d4a129c7807-kube-api-access-hh77h" (OuterVolumeSpecName: "kube-api-access-hh77h") pod "f18526ff-17a2-445c-b949-5d4a129c7807" (UID: "f18526ff-17a2-445c-b949-5d4a129c7807"). InnerVolumeSpecName "kube-api-access-hh77h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.687016 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f18526ff-17a2-445c-b949-5d4a129c7807-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f18526ff-17a2-445c-b949-5d4a129c7807" (UID: "f18526ff-17a2-445c-b949-5d4a129c7807"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.693142 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f18526ff-17a2-445c-b949-5d4a129c7807-config-data" (OuterVolumeSpecName: "config-data") pod "f18526ff-17a2-445c-b949-5d4a129c7807" (UID: "f18526ff-17a2-445c-b949-5d4a129c7807"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.746347 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh77h\" (UniqueName: \"kubernetes.io/projected/f18526ff-17a2-445c-b949-5d4a129c7807-kube-api-access-hh77h\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.746383 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f18526ff-17a2-445c-b949-5d4a129c7807-logs\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.746393 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f18526ff-17a2-445c-b949-5d4a129c7807-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.746402 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f18526ff-17a2-445c-b949-5d4a129c7807-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.776047 4742 generic.go:334] "Generic (PLEG): container finished" podID="f18526ff-17a2-445c-b949-5d4a129c7807" containerID="ab024a1a847ec4183e2d6f4123e82c16eba58bff1b1288427fb43a029682c7a6" exitCode=0 Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.776111 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f18526ff-17a2-445c-b949-5d4a129c7807","Type":"ContainerDied","Data":"ab024a1a847ec4183e2d6f4123e82c16eba58bff1b1288427fb43a029682c7a6"} Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.776111 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.776140 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f18526ff-17a2-445c-b949-5d4a129c7807","Type":"ContainerDied","Data":"569b6bf88e5d594984a827656d78d46ae1b1c6fcaf34d611618a612d2263877a"} Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.776157 4742 scope.go:117] "RemoveContainer" containerID="ab024a1a847ec4183e2d6f4123e82c16eba58bff1b1288427fb43a029682c7a6" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.779231 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1","Type":"ContainerStarted","Data":"4ea937241d7a788419b7213e521c75ca40ce867f018016c203e5770403ce4963"} Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.800652 4742 scope.go:117] "RemoveContainer" containerID="4bf89a006ab374cf59150d6248aa7aeb9f96c6cf25ed7c8c89f6f8ee8dedac9e" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.808271 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.817367 4742 scope.go:117] "RemoveContainer" containerID="ab024a1a847ec4183e2d6f4123e82c16eba58bff1b1288427fb43a029682c7a6" Mar 17 11:35:20 crc kubenswrapper[4742]: E0317 11:35:20.817799 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab024a1a847ec4183e2d6f4123e82c16eba58bff1b1288427fb43a029682c7a6\": container with ID starting with ab024a1a847ec4183e2d6f4123e82c16eba58bff1b1288427fb43a029682c7a6 not found: ID does not exist" containerID="ab024a1a847ec4183e2d6f4123e82c16eba58bff1b1288427fb43a029682c7a6" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.817826 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab024a1a847ec4183e2d6f4123e82c16eba58bff1b1288427fb43a029682c7a6"} err="failed to get container status \"ab024a1a847ec4183e2d6f4123e82c16eba58bff1b1288427fb43a029682c7a6\": rpc error: code = NotFound desc = could not find container \"ab024a1a847ec4183e2d6f4123e82c16eba58bff1b1288427fb43a029682c7a6\": container with ID starting with ab024a1a847ec4183e2d6f4123e82c16eba58bff1b1288427fb43a029682c7a6 not found: ID does not exist" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.817845 4742 scope.go:117] "RemoveContainer" containerID="4bf89a006ab374cf59150d6248aa7aeb9f96c6cf25ed7c8c89f6f8ee8dedac9e" Mar 17 11:35:20 crc kubenswrapper[4742]: E0317 11:35:20.818153 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bf89a006ab374cf59150d6248aa7aeb9f96c6cf25ed7c8c89f6f8ee8dedac9e\": container with ID starting with 4bf89a006ab374cf59150d6248aa7aeb9f96c6cf25ed7c8c89f6f8ee8dedac9e not found: ID does not exist" containerID="4bf89a006ab374cf59150d6248aa7aeb9f96c6cf25ed7c8c89f6f8ee8dedac9e" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.818188 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bf89a006ab374cf59150d6248aa7aeb9f96c6cf25ed7c8c89f6f8ee8dedac9e"} err="failed to get container status \"4bf89a006ab374cf59150d6248aa7aeb9f96c6cf25ed7c8c89f6f8ee8dedac9e\": rpc error: code = NotFound desc = could not find container \"4bf89a006ab374cf59150d6248aa7aeb9f96c6cf25ed7c8c89f6f8ee8dedac9e\": container with ID starting with 
4bf89a006ab374cf59150d6248aa7aeb9f96c6cf25ed7c8c89f6f8ee8dedac9e not found: ID does not exist" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.819170 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.834175 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 17 11:35:20 crc kubenswrapper[4742]: E0317 11:35:20.834550 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f18526ff-17a2-445c-b949-5d4a129c7807" containerName="nova-api-api" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.834565 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f18526ff-17a2-445c-b949-5d4a129c7807" containerName="nova-api-api" Mar 17 11:35:20 crc kubenswrapper[4742]: E0317 11:35:20.834595 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f18526ff-17a2-445c-b949-5d4a129c7807" containerName="nova-api-log" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.834600 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f18526ff-17a2-445c-b949-5d4a129c7807" containerName="nova-api-log" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.834940 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f18526ff-17a2-445c-b949-5d4a129c7807" containerName="nova-api-api" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.834956 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f18526ff-17a2-445c-b949-5d4a129c7807" containerName="nova-api-log" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.835807 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.840649 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.840923 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.857242 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.860691 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.948880 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-config-data\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.948994 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jwdn\" (UniqueName: \"kubernetes.io/projected/e0032f44-c10f-40b5-9f46-d4e61972df14-kube-api-access-2jwdn\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.949804 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0032f44-c10f-40b5-9f46-d4e61972df14-logs\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.950352 4742 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.950440 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:20 crc kubenswrapper[4742]: I0317 11:35:20.950458 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-public-tls-certs\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.052521 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.052614 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.052634 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-public-tls-certs\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.052704 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-config-data\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.052772 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jwdn\" (UniqueName: \"kubernetes.io/projected/e0032f44-c10f-40b5-9f46-d4e61972df14-kube-api-access-2jwdn\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.052827 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0032f44-c10f-40b5-9f46-d4e61972df14-logs\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.053327 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0032f44-c10f-40b5-9f46-d4e61972df14-logs\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 
11:35:21.056020 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.056718 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-config-data\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.056853 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-public-tls-certs\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.058039 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.063464 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.075092 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jwdn\" (UniqueName: \"kubernetes.io/projected/e0032f44-c10f-40b5-9f46-d4e61972df14-kube-api-access-2jwdn\") pod \"nova-api-0\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " pod="openstack/nova-api-0" Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.084377 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.193059 4742 util.go:30] "No sandbox for pod can be found. 
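The RemoveContainer/DeleteContainer cycles above (for ab024a1a... and 4bf89a00..., and earlier for the four ceilometer container IDs) all end in "code = NotFound", which the kubelet logs and then ignores: a container that no longer exists is already in the desired deleted state, so repeating the delete is harmless. A hedged sketch of that idempotent-delete pattern, with a hypothetical `runtime` type standing in for the CRI runtime service:

```go
package main

import (
	"errors"
	"fmt"
)

// errNotFound stands in for the gRPC "code = NotFound" errors above.
var errNotFound = errors.New("ID does not exist")

// runtime is a hypothetical stand-in for the CRI runtime service.
type runtime struct{ containers map[string]bool }

func (r *runtime) containerStatus(id string) error {
	if !r.containers[id] {
		return fmt.Errorf("could not find container %q: %w", id, errNotFound)
	}
	return nil
}

// removeContainer treats NotFound as success: a container that is
// already gone is in the desired (deleted) state, so the caller can
// log "DeleteContainer returned error" and simply move on.
func (r *runtime) removeContainer(id string) error {
	if err := r.containerStatus(id); errors.Is(err, errNotFound) {
		fmt.Printf("DeleteContainer returned error for %s...: %v\n", id[:12], err)
		return nil // idempotent: nothing left to delete
	}
	delete(r.containers, id)
	return nil
}

func main() {
	rt := &runtime{containers: map[string]bool{}}
	// Repeated deletes of an already-removed ID are no-ops, as in the log.
	id := "ab024a1a847ec4183e2d6f4123e82c16eba58bff1b1288427fb43a029682c7a6"
	_ = rt.removeContainer(id)
	_ = rt.removeContainer(id)
}
```

The log resumes with the new nova-api-0 sandbox: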
Need to start a new one" pod="openstack/nova-api-0" Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.646238 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 11:35:21 crc kubenswrapper[4742]: W0317 11:35:21.655219 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0032f44_c10f_40b5_9f46_d4e61972df14.slice/crio-23b0ef8acfd32375ba0c67578a6a5b7ff40a366dba30225d7234e2dac1d8df03 WatchSource:0}: Error finding container 23b0ef8acfd32375ba0c67578a6a5b7ff40a366dba30225d7234e2dac1d8df03: Status 404 returned error can't find the container with id 23b0ef8acfd32375ba0c67578a6a5b7ff40a366dba30225d7234e2dac1d8df03 Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.791760 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1","Type":"ContainerStarted","Data":"2ee6acb28d9e7fc7b420bf7ad537f2cf0f14bf8b2b48df712b90873862e20bb7"} Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.795067 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0032f44-c10f-40b5-9f46-d4e61972df14","Type":"ContainerStarted","Data":"23b0ef8acfd32375ba0c67578a6a5b7ff40a366dba30225d7234e2dac1d8df03"} Mar 17 11:35:21 crc kubenswrapper[4742]: I0317 11:35:21.812691 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.015073 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-zt8wj"] Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.016478 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zt8wj" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.019351 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.019553 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.031710 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-zt8wj"] Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.075359 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zt8wj\" (UID: \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\") " pod="openstack/nova-cell1-cell-mapping-zt8wj" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.075479 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swcxd\" (UniqueName: \"kubernetes.io/projected/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-kube-api-access-swcxd\") pod \"nova-cell1-cell-mapping-zt8wj\" (UID: \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\") " pod="openstack/nova-cell1-cell-mapping-zt8wj" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.075602 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-scripts\") pod \"nova-cell1-cell-mapping-zt8wj\" (UID: \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\") " pod="openstack/nova-cell1-cell-mapping-zt8wj" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.075626 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-config-data\") pod \"nova-cell1-cell-mapping-zt8wj\" (UID: \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\") " pod="openstack/nova-cell1-cell-mapping-zt8wj" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.176999 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swcxd\" (UniqueName: \"kubernetes.io/projected/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-kube-api-access-swcxd\") pod \"nova-cell1-cell-mapping-zt8wj\" (UID: \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\") " pod="openstack/nova-cell1-cell-mapping-zt8wj" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.177127 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-scripts\") pod \"nova-cell1-cell-mapping-zt8wj\" (UID: \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\") " pod="openstack/nova-cell1-cell-mapping-zt8wj" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.177161 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-config-data\") pod \"nova-cell1-cell-mapping-zt8wj\" (UID: \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\") " pod="openstack/nova-cell1-cell-mapping-zt8wj" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.177208 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zt8wj\" (UID: \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\") " pod="openstack/nova-cell1-cell-mapping-zt8wj" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.181580 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-config-data\") pod \"nova-cell1-cell-mapping-zt8wj\" (UID: \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\") " pod="openstack/nova-cell1-cell-mapping-zt8wj" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.181958 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-zt8wj\" (UID: \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\") " pod="openstack/nova-cell1-cell-mapping-zt8wj" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.185422 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-scripts\") pod \"nova-cell1-cell-mapping-zt8wj\" (UID: \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\") " pod="openstack/nova-cell1-cell-mapping-zt8wj" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.194663 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swcxd\" (UniqueName: \"kubernetes.io/projected/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-kube-api-access-swcxd\") pod \"nova-cell1-cell-mapping-zt8wj\" (UID: \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\") " pod="openstack/nova-cell1-cell-mapping-zt8wj" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.387031 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zt8wj" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.677854 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f18526ff-17a2-445c-b949-5d4a129c7807" path="/var/lib/kubelet/pods/f18526ff-17a2-445c-b949-5d4a129c7807/volumes" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.810288 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0032f44-c10f-40b5-9f46-d4e61972df14","Type":"ContainerStarted","Data":"fa6cb733ef3884e320cea26203b6b8787d65cee2e29770deaed5aba2116e8767"} Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.810653 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0032f44-c10f-40b5-9f46-d4e61972df14","Type":"ContainerStarted","Data":"b9f2b21d6dbf29c7d2ced36e179de5481f77bed73d7e2dd9e4e7cd9bf055b81e"} Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.841090 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.841065509 podStartE2EDuration="2.841065509s" podCreationTimestamp="2026-03-17 11:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:35:22.827565623 +0000 UTC m=+1425.953693381" watchObservedRunningTime="2026-03-17 11:35:22.841065509 +0000 UTC m=+1425.967193287" Mar 17 11:35:22 crc kubenswrapper[4742]: I0317 11:35:22.861458 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-zt8wj"] Mar 17 11:35:23 crc kubenswrapper[4742]: I0317 11:35:23.823056 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zt8wj" event={"ID":"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb","Type":"ContainerStarted","Data":"1bd6c3af4257784dba8c27c428d01b42ed4a63b5b3177825dd94022475d606c1"} Mar 17 11:35:23 crc kubenswrapper[4742]: I0317 11:35:23.824335 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zt8wj" event={"ID":"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb","Type":"ContainerStarted","Data":"8151dd6b678de04303f4ceeebea90c1acb51b2356df47798dda72b00e927914b"} Mar 17 11:35:23 crc kubenswrapper[4742]: I0317 11:35:23.829568 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerName="ceilometer-central-agent" containerID="cri-o://529f6308d569c883e9c969f479760566848fa03f627508d3e7662e7521abd424" gracePeriod=30 Mar 17 11:35:23 crc kubenswrapper[4742]: I0317 11:35:23.829853 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1","Type":"ContainerStarted","Data":"6f3313bd4009306047a7452ae7ced4585ba68d96c55f55fa4107935f19270e58"} Mar 17 11:35:23 crc kubenswrapper[4742]: I0317 11:35:23.829941 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 17 11:35:23 crc kubenswrapper[4742]: I0317 11:35:23.830014 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerName="proxy-httpd" containerID="cri-o://6f3313bd4009306047a7452ae7ced4585ba68d96c55f55fa4107935f19270e58" gracePeriod=30 Mar 17 11:35:23 crc kubenswrapper[4742]: I0317 11:35:23.830099 4742 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerName="sg-core" containerID="cri-o://2ee6acb28d9e7fc7b420bf7ad537f2cf0f14bf8b2b48df712b90873862e20bb7" gracePeriod=30 Mar 17 11:35:23 crc kubenswrapper[4742]: I0317 11:35:23.830161 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerName="ceilometer-notification-agent" containerID="cri-o://4ea937241d7a788419b7213e521c75ca40ce867f018016c203e5770403ce4963" gracePeriod=30 Mar 17 11:35:23 crc kubenswrapper[4742]: I0317 11:35:23.844795 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-zt8wj" podStartSLOduration=2.844774928 podStartE2EDuration="2.844774928s" podCreationTimestamp="2026-03-17 11:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:35:23.839868582 +0000 UTC m=+1426.965996360" watchObservedRunningTime="2026-03-17 11:35:23.844774928 +0000 UTC m=+1426.970902686" Mar 17 11:35:23 crc kubenswrapper[4742]: I0317 11:35:23.870963 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.756841905 podStartE2EDuration="5.870932848s" podCreationTimestamp="2026-03-17 11:35:18 +0000 UTC" firstStartedPulling="2026-03-17 11:35:18.980221245 +0000 UTC m=+1422.106349003" lastFinishedPulling="2026-03-17 11:35:23.094312188 +0000 UTC m=+1426.220439946" observedRunningTime="2026-03-17 11:35:23.868429238 +0000 UTC m=+1426.994557006" watchObservedRunningTime="2026-03-17 11:35:23.870932848 +0000 UTC m=+1426.997060636" Mar 17 11:35:24 crc kubenswrapper[4742]: I0317 11:35:24.251201 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:35:24 crc kubenswrapper[4742]: I0317 11:35:24.390043 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-rvllq"] Mar 17 11:35:24 crc kubenswrapper[4742]: I0317 11:35:24.390924 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-rvllq" podUID="215c4d89-a098-4983-8deb-44ba6bbfced4" containerName="dnsmasq-dns" containerID="cri-o://fc005c29098b135cc972ca2293ca45e898145aba5a0df8605de5f85a94bbacab" gracePeriod=10 Mar 17 11:35:24 crc kubenswrapper[4742]: I0317 11:35:24.849181 4742 generic.go:334] "Generic (PLEG): container finished" podID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerID="6f3313bd4009306047a7452ae7ced4585ba68d96c55f55fa4107935f19270e58" exitCode=0 Mar 17 11:35:24 crc kubenswrapper[4742]: I0317 11:35:24.849434 4742 generic.go:334] "Generic (PLEG): container finished" podID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerID="2ee6acb28d9e7fc7b420bf7ad537f2cf0f14bf8b2b48df712b90873862e20bb7" exitCode=2 Mar 17 11:35:24 crc kubenswrapper[4742]: I0317 11:35:24.849443 4742 generic.go:334] "Generic (PLEG): container finished" podID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerID="4ea937241d7a788419b7213e521c75ca40ce867f018016c203e5770403ce4963" exitCode=0 Mar 17 11:35:24 crc kubenswrapper[4742]: I0317 11:35:24.849450 4742 generic.go:334] "Generic (PLEG): container finished" podID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerID="529f6308d569c883e9c969f479760566848fa03f627508d3e7662e7521abd424" exitCode=0 Mar 17 11:35:24 crc kubenswrapper[4742]: I0317 11:35:24.849522 4742 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1","Type":"ContainerDied","Data":"6f3313bd4009306047a7452ae7ced4585ba68d96c55f55fa4107935f19270e58"}
Mar 17 11:35:24 crc kubenswrapper[4742]: I0317 11:35:24.849547 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1","Type":"ContainerDied","Data":"2ee6acb28d9e7fc7b420bf7ad537f2cf0f14bf8b2b48df712b90873862e20bb7"}
Mar 17 11:35:24 crc kubenswrapper[4742]: I0317 11:35:24.849556 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1","Type":"ContainerDied","Data":"4ea937241d7a788419b7213e521c75ca40ce867f018016c203e5770403ce4963"}
Mar 17 11:35:24 crc kubenswrapper[4742]: I0317 11:35:24.849564 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1","Type":"ContainerDied","Data":"529f6308d569c883e9c969f479760566848fa03f627508d3e7662e7521abd424"}
Mar 17 11:35:24 crc kubenswrapper[4742]: I0317 11:35:24.854373 4742 generic.go:334] "Generic (PLEG): container finished" podID="215c4d89-a098-4983-8deb-44ba6bbfced4" containerID="fc005c29098b135cc972ca2293ca45e898145aba5a0df8605de5f85a94bbacab" exitCode=0
Mar 17 11:35:24 crc kubenswrapper[4742]: I0317 11:35:24.854517 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-rvllq" event={"ID":"215c4d89-a098-4983-8deb-44ba6bbfced4","Type":"ContainerDied","Data":"fc005c29098b135cc972ca2293ca45e898145aba5a0df8605de5f85a94bbacab"}
Mar 17 11:35:24 crc kubenswrapper[4742]: I0317 11:35:24.869055 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-rvllq"
Mar 17 11:35:24 crc kubenswrapper[4742]: I0317 11:35:24.924433 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.050664 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-config-data\") pod \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") "
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.050772 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dldkk\" (UniqueName: \"kubernetes.io/projected/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-kube-api-access-dldkk\") pod \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") "
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.050800 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-run-httpd\") pod \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") "
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.050831 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-dns-swift-storage-0\") pod \"215c4d89-a098-4983-8deb-44ba6bbfced4\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") "
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.050917 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-combined-ca-bundle\") pod \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") "
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.050996 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-ovsdbserver-nb\") pod \"215c4d89-a098-4983-8deb-44ba6bbfced4\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") "
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.051022 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-dns-svc\") pod \"215c4d89-a098-4983-8deb-44ba6bbfced4\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") "
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.051054 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-ceilometer-tls-certs\") pod \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") "
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.051110 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-ovsdbserver-sb\") pod \"215c4d89-a098-4983-8deb-44ba6bbfced4\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") "
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.051145 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-log-httpd\") pod \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") "
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.051209 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-config\") pod \"215c4d89-a098-4983-8deb-44ba6bbfced4\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") "
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.051283 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-sg-core-conf-yaml\") pod \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") "
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.051329 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4m2k\" (UniqueName: \"kubernetes.io/projected/215c4d89-a098-4983-8deb-44ba6bbfced4-kube-api-access-l4m2k\") pod \"215c4d89-a098-4983-8deb-44ba6bbfced4\" (UID: \"215c4d89-a098-4983-8deb-44ba6bbfced4\") "
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.051353 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-scripts\") pod \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\" (UID: \"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1\") "
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.053644 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" (UID: "d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.058829 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" (UID: "d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.060279 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-scripts" (OuterVolumeSpecName: "scripts") pod "d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" (UID: "d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.072582 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/215c4d89-a098-4983-8deb-44ba6bbfced4-kube-api-access-l4m2k" (OuterVolumeSpecName: "kube-api-access-l4m2k") pod "215c4d89-a098-4983-8deb-44ba6bbfced4" (UID: "215c4d89-a098-4983-8deb-44ba6bbfced4"). InnerVolumeSpecName "kube-api-access-l4m2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.072750 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-kube-api-access-dldkk" (OuterVolumeSpecName: "kube-api-access-dldkk") pod "d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" (UID: "d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1"). InnerVolumeSpecName "kube-api-access-dldkk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.091840 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" (UID: "d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.114774 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "215c4d89-a098-4983-8deb-44ba6bbfced4" (UID: "215c4d89-a098-4983-8deb-44ba6bbfced4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.126892 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-config" (OuterVolumeSpecName: "config") pod "215c4d89-a098-4983-8deb-44ba6bbfced4" (UID: "215c4d89-a098-4983-8deb-44ba6bbfced4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.128514 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "215c4d89-a098-4983-8deb-44ba6bbfced4" (UID: "215c4d89-a098-4983-8deb-44ba6bbfced4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.132442 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "215c4d89-a098-4983-8deb-44ba6bbfced4" (UID: "215c4d89-a098-4983-8deb-44ba6bbfced4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.134890 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" (UID: "d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.139344 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "215c4d89-a098-4983-8deb-44ba6bbfced4" (UID: "215c4d89-a098-4983-8deb-44ba6bbfced4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.161427 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-config\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.161460 4742 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.161474 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4m2k\" (UniqueName: \"kubernetes.io/projected/215c4d89-a098-4983-8deb-44ba6bbfced4-kube-api-access-l4m2k\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.161488 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.161500 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dldkk\" (UniqueName: \"kubernetes.io/projected/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-kube-api-access-dldkk\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.161511 4742 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.161524 4742 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.161536 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.161549 4742 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.161560 4742 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.161575 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/215c4d89-a098-4983-8deb-44ba6bbfced4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.161587 4742 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.163225 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" (UID: "d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.179293 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-config-data" (OuterVolumeSpecName: "config-data") pod "d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" (UID: "d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.263226 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-config-data\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.263258 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.867953 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-rvllq" event={"ID":"215c4d89-a098-4983-8deb-44ba6bbfced4","Type":"ContainerDied","Data":"97dcc4a46915249d07fc35133e926fdf0449379974b73565e08a640e421e1bcc"}
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.868327 4742 scope.go:117] "RemoveContainer" containerID="fc005c29098b135cc972ca2293ca45e898145aba5a0df8605de5f85a94bbacab"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.868466 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-rvllq"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.876199 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1","Type":"ContainerDied","Data":"ae8a2b14142fadd2b21171b313b2155ee6ee1b5ebfdd8fb088add351b7a6eb19"}
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.876318 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.925850 4742 scope.go:117] "RemoveContainer" containerID="42662be949242104888bd445725e1c3b63e5ca039880f167694c5fe6351582c8"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.939188 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-rvllq"]
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.953590 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-rvllq"]
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.967666 4742 scope.go:117] "RemoveContainer" containerID="6f3313bd4009306047a7452ae7ced4585ba68d96c55f55fa4107935f19270e58"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.967829 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.979108 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.991661 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 17 11:35:25 crc kubenswrapper[4742]: E0317 11:35:25.992168 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="215c4d89-a098-4983-8deb-44ba6bbfced4" containerName="dnsmasq-dns"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.992182 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="215c4d89-a098-4983-8deb-44ba6bbfced4" containerName="dnsmasq-dns"
Mar 17 11:35:25 crc kubenswrapper[4742]: E0317 11:35:25.992209 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="215c4d89-a098-4983-8deb-44ba6bbfced4" containerName="init"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.992216 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="215c4d89-a098-4983-8deb-44ba6bbfced4" containerName="init"
Mar 17 11:35:25 crc kubenswrapper[4742]: E0317 11:35:25.992231 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerName="ceilometer-notification-agent"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.992248 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerName="ceilometer-notification-agent"
Mar 17 11:35:25 crc kubenswrapper[4742]: E0317 11:35:25.992268 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerName="sg-core"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.992275 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerName="sg-core"
Mar 17 11:35:25 crc kubenswrapper[4742]: E0317 11:35:25.992286 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerName="ceilometer-central-agent"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.992292 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerName="ceilometer-central-agent"
Mar 17 11:35:25 crc kubenswrapper[4742]: E0317 11:35:25.992321 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerName="proxy-httpd"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.992329 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerName="proxy-httpd"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.992533 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerName="ceilometer-central-agent"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.992548 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerName="ceilometer-notification-agent"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.992562 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerName="sg-core"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.992573 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" containerName="proxy-httpd"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.992588 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="215c4d89-a098-4983-8deb-44ba6bbfced4" containerName="dnsmasq-dns"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.995554 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.998671 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.998862 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 17 11:35:25 crc kubenswrapper[4742]: I0317 11:35:25.999199 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.007577 4742 scope.go:117] "RemoveContainer" containerID="2ee6acb28d9e7fc7b420bf7ad537f2cf0f14bf8b2b48df712b90873862e20bb7"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.023058 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.029988 4742 scope.go:117] "RemoveContainer" containerID="4ea937241d7a788419b7213e521c75ca40ce867f018016c203e5770403ce4963"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.059642 4742 scope.go:117] "RemoveContainer" containerID="529f6308d569c883e9c969f479760566848fa03f627508d3e7662e7521abd424"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.082059 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea32fef3-81ea-41cb-8641-3a43304683c6-config-data\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.082096 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea32fef3-81ea-41cb-8641-3a43304683c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.082124 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l49bb\" (UniqueName: \"kubernetes.io/projected/ea32fef3-81ea-41cb-8641-3a43304683c6-kube-api-access-l49bb\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.082163 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea32fef3-81ea-41cb-8641-3a43304683c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.082180 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea32fef3-81ea-41cb-8641-3a43304683c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.082335 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea32fef3-81ea-41cb-8641-3a43304683c6-scripts\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.082439 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea32fef3-81ea-41cb-8641-3a43304683c6-run-httpd\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.082470 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea32fef3-81ea-41cb-8641-3a43304683c6-log-httpd\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.183082 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea32fef3-81ea-41cb-8641-3a43304683c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.183117 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea32fef3-81ea-41cb-8641-3a43304683c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.183177 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea32fef3-81ea-41cb-8641-3a43304683c6-scripts\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.183229 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea32fef3-81ea-41cb-8641-3a43304683c6-run-httpd\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.183249 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea32fef3-81ea-41cb-8641-3a43304683c6-log-httpd\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.183279 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea32fef3-81ea-41cb-8641-3a43304683c6-config-data\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.183295 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea32fef3-81ea-41cb-8641-3a43304683c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.183315 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l49bb\" (UniqueName: \"kubernetes.io/projected/ea32fef3-81ea-41cb-8641-3a43304683c6-kube-api-access-l49bb\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.184010 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea32fef3-81ea-41cb-8641-3a43304683c6-log-httpd\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.184317 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ea32fef3-81ea-41cb-8641-3a43304683c6-run-httpd\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.189145 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea32fef3-81ea-41cb-8641-3a43304683c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.189678 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea32fef3-81ea-41cb-8641-3a43304683c6-scripts\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.190049 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea32fef3-81ea-41cb-8641-3a43304683c6-config-data\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.190945 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ea32fef3-81ea-41cb-8641-3a43304683c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.191295 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea32fef3-81ea-41cb-8641-3a43304683c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.201176 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l49bb\" (UniqueName: \"kubernetes.io/projected/ea32fef3-81ea-41cb-8641-3a43304683c6-kube-api-access-l49bb\") pod \"ceilometer-0\" (UID: \"ea32fef3-81ea-41cb-8641-3a43304683c6\") " pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.321988 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.675040 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="215c4d89-a098-4983-8deb-44ba6bbfced4" path="/var/lib/kubelet/pods/215c4d89-a098-4983-8deb-44ba6bbfced4/volumes"
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.676173 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1" path="/var/lib/kubelet/pods/d3fd15f3-6f78-48a3-9ecb-8ff726aa8ba1/volumes"
Mar 17 11:35:26 crc kubenswrapper[4742]: W0317 11:35:26.887806 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea32fef3_81ea_41cb_8641_3a43304683c6.slice/crio-201b3321277ed433d76f2ad0e439adae29d718e0a5c531f88b8381e8c5cb0b63 WatchSource:0}: Error finding container 201b3321277ed433d76f2ad0e439adae29d718e0a5c531f88b8381e8c5cb0b63: Status 404 returned error can't find the container with id 201b3321277ed433d76f2ad0e439adae29d718e0a5c531f88b8381e8c5cb0b63
Mar 17 11:35:26 crc kubenswrapper[4742]: I0317 11:35:26.899538 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 17 11:35:27 crc kubenswrapper[4742]: I0317 11:35:27.903162 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea32fef3-81ea-41cb-8641-3a43304683c6","Type":"ContainerStarted","Data":"201b3321277ed433d76f2ad0e439adae29d718e0a5c531f88b8381e8c5cb0b63"}
Mar 17 11:35:28 crc kubenswrapper[4742]: I0317 11:35:28.948235 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea32fef3-81ea-41cb-8641-3a43304683c6","Type":"ContainerStarted","Data":"d98aac431821fdd5f68b750010eca2150420ae4f9f360dfd9b3f57d94a3acf3b"}
Mar 17 11:35:28 crc kubenswrapper[4742]: I0317 11:35:28.948626 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea32fef3-81ea-41cb-8641-3a43304683c6","Type":"ContainerStarted","Data":"0c5c87eb96cf76797dc65c13302a08080966a51c76f30c5c02a4324ed8f1f90f"}
Mar 17 11:35:28 crc kubenswrapper[4742]: I0317 11:35:28.950493 4742 generic.go:334] "Generic (PLEG): container finished" podID="17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb" containerID="1bd6c3af4257784dba8c27c428d01b42ed4a63b5b3177825dd94022475d606c1" exitCode=0
Mar 17 11:35:28 crc kubenswrapper[4742]: I0317 11:35:28.950542 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zt8wj" event={"ID":"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb","Type":"ContainerDied","Data":"1bd6c3af4257784dba8c27c428d01b42ed4a63b5b3177825dd94022475d606c1"}
Mar 17 11:35:29 crc kubenswrapper[4742]: I0317 11:35:29.967333 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea32fef3-81ea-41cb-8641-3a43304683c6","Type":"ContainerStarted","Data":"34d71ceaa5cc2fc9c9e6e78a190078a983f7bb770e5f5b1fa4d20c013d9eebc0"}
Mar 17 11:35:30 crc kubenswrapper[4742]: I0317 11:35:30.418934 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zt8wj"
Mar 17 11:35:30 crc kubenswrapper[4742]: I0317 11:35:30.583744 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swcxd\" (UniqueName: \"kubernetes.io/projected/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-kube-api-access-swcxd\") pod \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\" (UID: \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\") "
Mar 17 11:35:30 crc kubenswrapper[4742]: I0317 11:35:30.583808 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-config-data\") pod \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\" (UID: \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\") "
Mar 17 11:35:30 crc kubenswrapper[4742]: I0317 11:35:30.583868 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-combined-ca-bundle\") pod \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\" (UID: \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\") "
Mar 17 11:35:30 crc kubenswrapper[4742]: I0317 11:35:30.584038 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-scripts\") pod \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\" (UID: \"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb\") "
Mar 17 11:35:30 crc kubenswrapper[4742]: I0317 11:35:30.589921 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-scripts" (OuterVolumeSpecName: "scripts") pod "17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb" (UID: "17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:35:30 crc kubenswrapper[4742]: I0317 11:35:30.597614 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-kube-api-access-swcxd" (OuterVolumeSpecName: "kube-api-access-swcxd") pod "17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb" (UID: "17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb"). InnerVolumeSpecName "kube-api-access-swcxd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:35:30 crc kubenswrapper[4742]: I0317 11:35:30.610173 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-config-data" (OuterVolumeSpecName: "config-data") pod "17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb" (UID: "17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:35:30 crc kubenswrapper[4742]: I0317 11:35:30.625452 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb" (UID: "17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:35:30 crc kubenswrapper[4742]: I0317 11:35:30.687016 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swcxd\" (UniqueName: \"kubernetes.io/projected/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-kube-api-access-swcxd\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:30 crc kubenswrapper[4742]: I0317 11:35:30.687066 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-config-data\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:30 crc kubenswrapper[4742]: I0317 11:35:30.687085 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:30 crc kubenswrapper[4742]: I0317 11:35:30.687100 4742 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb-scripts\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:30 crc kubenswrapper[4742]: I0317 11:35:30.988209 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-zt8wj" event={"ID":"17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb","Type":"ContainerDied","Data":"8151dd6b678de04303f4ceeebea90c1acb51b2356df47798dda72b00e927914b"}
Mar 17 11:35:30 crc kubenswrapper[4742]: I0317 11:35:30.988698 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8151dd6b678de04303f4ceeebea90c1acb51b2356df47798dda72b00e927914b"
Mar 17 11:35:30 crc kubenswrapper[4742]: I0317 11:35:30.988403 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-zt8wj"
Mar 17 11:35:31 crc kubenswrapper[4742]: I0317 11:35:31.193324 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 17 11:35:31 crc kubenswrapper[4742]: I0317 11:35:31.193387 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 17 11:35:31 crc kubenswrapper[4742]: I0317 11:35:31.193550 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 17 11:35:31 crc kubenswrapper[4742]: I0317 11:35:31.193669 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e0032f44-c10f-40b5-9f46-d4e61972df14" containerName="nova-api-log" containerID="cri-o://b9f2b21d6dbf29c7d2ced36e179de5481f77bed73d7e2dd9e4e7cd9bf055b81e" gracePeriod=30
Mar 17 11:35:31 crc kubenswrapper[4742]: I0317 11:35:31.193740 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e0032f44-c10f-40b5-9f46-d4e61972df14" containerName="nova-api-api" containerID="cri-o://fa6cb733ef3884e320cea26203b6b8787d65cee2e29770deaed5aba2116e8767" gracePeriod=30
Mar 17 11:35:31 crc kubenswrapper[4742]: I0317 11:35:31.211438 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 17 11:35:31 crc kubenswrapper[4742]: I0317 11:35:31.211710 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="cbcfc054-1dcd-4a3e-a79d-7574f434b972" containerName="nova-scheduler-scheduler" containerID="cri-o://9513ee6d2312f082bac94466635f07cb10003f04774ca6525f2d14c10caebe35" gracePeriod=30
Mar 17 11:35:31 crc kubenswrapper[4742]: I0317 11:35:31.212482 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e0032f44-c10f-40b5-9f46-d4e61972df14" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": EOF"
Mar 17 11:35:31 crc kubenswrapper[4742]: I0317 11:35:31.212565 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e0032f44-c10f-40b5-9f46-d4e61972df14" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": EOF"
Mar 17 11:35:31 crc kubenswrapper[4742]: I0317 11:35:31.234297 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 17 11:35:31 crc kubenswrapper[4742]: I0317 11:35:31.234506 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f3d29251-108b-4705-8e84-36f40549b65c" containerName="nova-metadata-log" containerID="cri-o://fb473ad5e8e64e4c8a9277822c3a05864f6e39fa05d7f5e41705574a7d40dc84" gracePeriod=30
Mar 17 11:35:31 crc kubenswrapper[4742]: I0317 11:35:31.234621 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f3d29251-108b-4705-8e84-36f40549b65c" containerName="nova-metadata-metadata" containerID="cri-o://172696b90d932e902a62e2470cac16687684f759a9723f95ea5cf6bcbd54800b" gracePeriod=30
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.001848 4742 generic.go:334] "Generic (PLEG): container finished" podID="cbcfc054-1dcd-4a3e-a79d-7574f434b972" containerID="9513ee6d2312f082bac94466635f07cb10003f04774ca6525f2d14c10caebe35" exitCode=0
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.002292 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cbcfc054-1dcd-4a3e-a79d-7574f434b972","Type":"ContainerDied","Data":"9513ee6d2312f082bac94466635f07cb10003f04774ca6525f2d14c10caebe35"}
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.005986 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ea32fef3-81ea-41cb-8641-3a43304683c6","Type":"ContainerStarted","Data":"bc73ead633ea4bce76986f7e8c11f8d580fb3ca0c60353f9438362e202d857de"}
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.006285 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.011865 4742 generic.go:334] "Generic (PLEG): container finished" podID="e0032f44-c10f-40b5-9f46-d4e61972df14" containerID="b9f2b21d6dbf29c7d2ced36e179de5481f77bed73d7e2dd9e4e7cd9bf055b81e" exitCode=143
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.012018 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0032f44-c10f-40b5-9f46-d4e61972df14","Type":"ContainerDied","Data":"b9f2b21d6dbf29c7d2ced36e179de5481f77bed73d7e2dd9e4e7cd9bf055b81e"}
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.031286 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3d29251-108b-4705-8e84-36f40549b65c","Type":"ContainerDied","Data":"fb473ad5e8e64e4c8a9277822c3a05864f6e39fa05d7f5e41705574a7d40dc84"}
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.031248 4742 generic.go:334] "Generic (PLEG): container finished" podID="f3d29251-108b-4705-8e84-36f40549b65c" containerID="fb473ad5e8e64e4c8a9277822c3a05864f6e39fa05d7f5e41705574a7d40dc84" exitCode=143
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.039958 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.962168703 podStartE2EDuration="7.039938874s" podCreationTimestamp="2026-03-17 11:35:25 +0000 UTC" firstStartedPulling="2026-03-17 11:35:26.890539091 +0000 UTC m=+1430.016666869" lastFinishedPulling="2026-03-17 11:35:30.968309252 +0000 UTC m=+1434.094437040" observedRunningTime="2026-03-17 11:35:32.025272186 +0000 UTC m=+1435.151399944" watchObservedRunningTime="2026-03-17 11:35:32.039938874 +0000 UTC m=+1435.166066632"
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.140794 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.318127 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbcfc054-1dcd-4a3e-a79d-7574f434b972-config-data\") pod \"cbcfc054-1dcd-4a3e-a79d-7574f434b972\" (UID: \"cbcfc054-1dcd-4a3e-a79d-7574f434b972\") "
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.319055 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9ttb\" (UniqueName: \"kubernetes.io/projected/cbcfc054-1dcd-4a3e-a79d-7574f434b972-kube-api-access-z9ttb\") pod \"cbcfc054-1dcd-4a3e-a79d-7574f434b972\" (UID: \"cbcfc054-1dcd-4a3e-a79d-7574f434b972\") "
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.319279 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbcfc054-1dcd-4a3e-a79d-7574f434b972-combined-ca-bundle\") pod \"cbcfc054-1dcd-4a3e-a79d-7574f434b972\" (UID: \"cbcfc054-1dcd-4a3e-a79d-7574f434b972\") "
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.325752 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbcfc054-1dcd-4a3e-a79d-7574f434b972-kube-api-access-z9ttb" (OuterVolumeSpecName: "kube-api-access-z9ttb") pod "cbcfc054-1dcd-4a3e-a79d-7574f434b972" (UID: "cbcfc054-1dcd-4a3e-a79d-7574f434b972"). InnerVolumeSpecName "kube-api-access-z9ttb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.357089 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbcfc054-1dcd-4a3e-a79d-7574f434b972-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbcfc054-1dcd-4a3e-a79d-7574f434b972" (UID: "cbcfc054-1dcd-4a3e-a79d-7574f434b972"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.375230 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbcfc054-1dcd-4a3e-a79d-7574f434b972-config-data" (OuterVolumeSpecName: "config-data") pod "cbcfc054-1dcd-4a3e-a79d-7574f434b972" (UID: "cbcfc054-1dcd-4a3e-a79d-7574f434b972"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.421723 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9ttb\" (UniqueName: \"kubernetes.io/projected/cbcfc054-1dcd-4a3e-a79d-7574f434b972-kube-api-access-z9ttb\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.421837 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbcfc054-1dcd-4a3e-a79d-7574f434b972-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:32 crc kubenswrapper[4742]: I0317 11:35:32.421895 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbcfc054-1dcd-4a3e-a79d-7574f434b972-config-data\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.041855 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cbcfc054-1dcd-4a3e-a79d-7574f434b972","Type":"ContainerDied","Data":"e11efd7380a8bfce30f44668e874d3fa51a47e3c6aa0dc9715fa6ffc79bccec0"}
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.041877 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.041943 4742 scope.go:117] "RemoveContainer" containerID="9513ee6d2312f082bac94466635f07cb10003f04774ca6525f2d14c10caebe35"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.082992 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.096971 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.117987 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 17 11:35:33 crc kubenswrapper[4742]: E0317 11:35:33.118386 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbcfc054-1dcd-4a3e-a79d-7574f434b972" containerName="nova-scheduler-scheduler"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.118402 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbcfc054-1dcd-4a3e-a79d-7574f434b972" containerName="nova-scheduler-scheduler"
Mar 17 11:35:33 crc kubenswrapper[4742]: E0317 11:35:33.118415 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb" containerName="nova-manage"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.118421 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb" containerName="nova-manage"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.118584 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbcfc054-1dcd-4a3e-a79d-7574f434b972" containerName="nova-scheduler-scheduler"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.118606 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb" containerName="nova-manage"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.119168 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.125052 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.128409 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.253507 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8591a2-6548-4bcb-8be3-71e549605bd2-config-data\") pod \"nova-scheduler-0\" (UID: \"5c8591a2-6548-4bcb-8be3-71e549605bd2\") " pod="openstack/nova-scheduler-0"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.253768 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw4fc\" (UniqueName: \"kubernetes.io/projected/5c8591a2-6548-4bcb-8be3-71e549605bd2-kube-api-access-lw4fc\") pod \"nova-scheduler-0\" (UID: \"5c8591a2-6548-4bcb-8be3-71e549605bd2\") " pod="openstack/nova-scheduler-0"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.253964 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8591a2-6548-4bcb-8be3-71e549605bd2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5c8591a2-6548-4bcb-8be3-71e549605bd2\") " pod="openstack/nova-scheduler-0"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.356762 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw4fc\" (UniqueName: \"kubernetes.io/projected/5c8591a2-6548-4bcb-8be3-71e549605bd2-kube-api-access-lw4fc\") pod \"nova-scheduler-0\" (UID: \"5c8591a2-6548-4bcb-8be3-71e549605bd2\") " pod="openstack/nova-scheduler-0"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.357127 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8591a2-6548-4bcb-8be3-71e549605bd2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5c8591a2-6548-4bcb-8be3-71e549605bd2\") " pod="openstack/nova-scheduler-0"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.357199 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8591a2-6548-4bcb-8be3-71e549605bd2-config-data\") pod \"nova-scheduler-0\" (UID: \"5c8591a2-6548-4bcb-8be3-71e549605bd2\") " pod="openstack/nova-scheduler-0"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.362171 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8591a2-6548-4bcb-8be3-71e549605bd2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5c8591a2-6548-4bcb-8be3-71e549605bd2\") " pod="openstack/nova-scheduler-0"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.364144 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8591a2-6548-4bcb-8be3-71e549605bd2-config-data\") pod \"nova-scheduler-0\" (UID: \"5c8591a2-6548-4bcb-8be3-71e549605bd2\") " pod="openstack/nova-scheduler-0"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.375040 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw4fc\" (UniqueName: \"kubernetes.io/projected/5c8591a2-6548-4bcb-8be3-71e549605bd2-kube-api-access-lw4fc\") pod \"nova-scheduler-0\" (UID: \"5c8591a2-6548-4bcb-8be3-71e549605bd2\") " pod="openstack/nova-scheduler-0"
Mar 17 11:35:33 crc kubenswrapper[4742]: I0317 11:35:33.505157 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 17 11:35:34 crc kubenswrapper[4742]: I0317 11:35:34.015364 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 17 11:35:34 crc kubenswrapper[4742]: I0317 11:35:34.054657 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c8591a2-6548-4bcb-8be3-71e549605bd2","Type":"ContainerStarted","Data":"80b6402de295a5f6f8ac7896880f3b7f98a7de77f3f4268adcc1cb7801ce5b1a"}
Mar 17 11:35:34 crc kubenswrapper[4742]: I0317 11:35:34.677420 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbcfc054-1dcd-4a3e-a79d-7574f434b972" path="/var/lib/kubelet/pods/cbcfc054-1dcd-4a3e-a79d-7574f434b972/volumes"
Mar 17 11:35:34 crc kubenswrapper[4742]: I0317 11:35:34.830273 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.003470 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3d29251-108b-4705-8e84-36f40549b65c-logs\") pod \"f3d29251-108b-4705-8e84-36f40549b65c\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") "
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.003618 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-nova-metadata-tls-certs\") pod \"f3d29251-108b-4705-8e84-36f40549b65c\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") "
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.003659 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-config-data\") pod \"f3d29251-108b-4705-8e84-36f40549b65c\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") "
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.003687 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-combined-ca-bundle\") pod \"f3d29251-108b-4705-8e84-36f40549b65c\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") "
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.003721 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdmwj\" (UniqueName: \"kubernetes.io/projected/f3d29251-108b-4705-8e84-36f40549b65c-kube-api-access-rdmwj\") pod \"f3d29251-108b-4705-8e84-36f40549b65c\" (UID: \"f3d29251-108b-4705-8e84-36f40549b65c\") "
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.004303 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3d29251-108b-4705-8e84-36f40549b65c-logs" (OuterVolumeSpecName: "logs") pod "f3d29251-108b-4705-8e84-36f40549b65c" (UID: "f3d29251-108b-4705-8e84-36f40549b65c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.010034 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3d29251-108b-4705-8e84-36f40549b65c-kube-api-access-rdmwj" (OuterVolumeSpecName: "kube-api-access-rdmwj") pod "f3d29251-108b-4705-8e84-36f40549b65c" (UID: "f3d29251-108b-4705-8e84-36f40549b65c"). InnerVolumeSpecName "kube-api-access-rdmwj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.047796 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3d29251-108b-4705-8e84-36f40549b65c" (UID: "f3d29251-108b-4705-8e84-36f40549b65c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.048251 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-config-data" (OuterVolumeSpecName: "config-data") pod "f3d29251-108b-4705-8e84-36f40549b65c" (UID: "f3d29251-108b-4705-8e84-36f40549b65c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.057991 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f3d29251-108b-4705-8e84-36f40549b65c" (UID: "f3d29251-108b-4705-8e84-36f40549b65c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.069344 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c8591a2-6548-4bcb-8be3-71e549605bd2","Type":"ContainerStarted","Data":"13b0530319669355954b1baa856dca9b2fc72cbac2ea4cee8a78007ac2c811d2"}
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.071402 4742 generic.go:334] "Generic (PLEG): container finished" podID="f3d29251-108b-4705-8e84-36f40549b65c" containerID="172696b90d932e902a62e2470cac16687684f759a9723f95ea5cf6bcbd54800b" exitCode=0
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.071433 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3d29251-108b-4705-8e84-36f40549b65c","Type":"ContainerDied","Data":"172696b90d932e902a62e2470cac16687684f759a9723f95ea5cf6bcbd54800b"}
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.071453 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3d29251-108b-4705-8e84-36f40549b65c","Type":"ContainerDied","Data":"335c6319b37f14bb22053f6caa256db30c9f4a418b89f2273c45507bd9c3c2a5"}
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.071468 4742 scope.go:117] "RemoveContainer" containerID="172696b90d932e902a62e2470cac16687684f759a9723f95ea5cf6bcbd54800b"
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.071512 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.104312 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.104295826 podStartE2EDuration="2.104295826s" podCreationTimestamp="2026-03-17 11:35:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:35:35.095465009 +0000 UTC m=+1438.221592767" watchObservedRunningTime="2026-03-17 11:35:35.104295826 +0000 UTC m=+1438.230423584"
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.104510 4742 scope.go:117] "RemoveContainer" containerID="fb473ad5e8e64e4c8a9277822c3a05864f6e39fa05d7f5e41705574a7d40dc84"
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.105478 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3d29251-108b-4705-8e84-36f40549b65c-logs\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.105508 4742 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.105518 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-config-data\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.105526 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d29251-108b-4705-8e84-36f40549b65c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.105535 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdmwj\" (UniqueName: \"kubernetes.io/projected/f3d29251-108b-4705-8e84-36f40549b65c-kube-api-access-rdmwj\") on node \"crc\" DevicePath \"\""
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.128867 4742 scope.go:117] "RemoveContainer" containerID="172696b90d932e902a62e2470cac16687684f759a9723f95ea5cf6bcbd54800b"
Mar 17 11:35:35 crc kubenswrapper[4742]: E0317 11:35:35.129481 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"172696b90d932e902a62e2470cac16687684f759a9723f95ea5cf6bcbd54800b\": container with ID starting with 172696b90d932e902a62e2470cac16687684f759a9723f95ea5cf6bcbd54800b not found: ID does not exist" containerID="172696b90d932e902a62e2470cac16687684f759a9723f95ea5cf6bcbd54800b"
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.129520 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"172696b90d932e902a62e2470cac16687684f759a9723f95ea5cf6bcbd54800b"} err="failed to get container status \"172696b90d932e902a62e2470cac16687684f759a9723f95ea5cf6bcbd54800b\": rpc error: code = NotFound desc = could not find container \"172696b90d932e902a62e2470cac16687684f759a9723f95ea5cf6bcbd54800b\": container with ID starting with 172696b90d932e902a62e2470cac16687684f759a9723f95ea5cf6bcbd54800b not found: ID does not exist"
Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.129545 4742 scope.go:117] "RemoveContainer" containerID="fb473ad5e8e64e4c8a9277822c3a05864f6e39fa05d7f5e41705574a7d40dc84"
Mar 17 11:35:35 crc kubenswrapper[4742]: E0317 11:35:35.129986 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb473ad5e8e64e4c8a9277822c3a05864f6e39fa05d7f5e41705574a7d40dc84\": container with ID starting with fb473ad5e8e64e4c8a9277822c3a05864f6e39fa05d7f5e41705574a7d40dc84 not found: ID does not exist" containerID="fb473ad5e8e64e4c8a9277822c3a05864f6e39fa05d7f5e41705574a7d40dc84" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.130044 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb473ad5e8e64e4c8a9277822c3a05864f6e39fa05d7f5e41705574a7d40dc84"} err="failed to get container status \"fb473ad5e8e64e4c8a9277822c3a05864f6e39fa05d7f5e41705574a7d40dc84\": rpc error: code = NotFound desc = could not find container \"fb473ad5e8e64e4c8a9277822c3a05864f6e39fa05d7f5e41705574a7d40dc84\": container with ID starting with fb473ad5e8e64e4c8a9277822c3a05864f6e39fa05d7f5e41705574a7d40dc84 not found: ID does not exist" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.138615 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.154436 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.168812 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 17 11:35:35 crc kubenswrapper[4742]: E0317 11:35:35.169211 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3d29251-108b-4705-8e84-36f40549b65c" containerName="nova-metadata-log" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.169226 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3d29251-108b-4705-8e84-36f40549b65c" containerName="nova-metadata-log" Mar 17 11:35:35 crc kubenswrapper[4742]: E0317 11:35:35.169258 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3d29251-108b-4705-8e84-36f40549b65c" containerName="nova-metadata-metadata" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.169264 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3d29251-108b-4705-8e84-36f40549b65c" containerName="nova-metadata-metadata" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.169431 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3d29251-108b-4705-8e84-36f40549b65c" containerName="nova-metadata-log" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.169448 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3d29251-108b-4705-8e84-36f40549b65c" containerName="nova-metadata-metadata" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.170335 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.175360 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.177441 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.184969 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.309188 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f6a1398-04d6-4668-9689-17bdbb214850-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8f6a1398-04d6-4668-9689-17bdbb214850\") " pod="openstack/nova-metadata-0" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.309271 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvl2s\" (UniqueName: \"kubernetes.io/projected/8f6a1398-04d6-4668-9689-17bdbb214850-kube-api-access-kvl2s\") pod \"nova-metadata-0\" (UID: \"8f6a1398-04d6-4668-9689-17bdbb214850\") " pod="openstack/nova-metadata-0" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.309299 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6a1398-04d6-4668-9689-17bdbb214850-config-data\") pod \"nova-metadata-0\" (UID: \"8f6a1398-04d6-4668-9689-17bdbb214850\") " pod="openstack/nova-metadata-0" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.309376 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6a1398-04d6-4668-9689-17bdbb214850-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8f6a1398-04d6-4668-9689-17bdbb214850\") " pod="openstack/nova-metadata-0" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.309441 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f6a1398-04d6-4668-9689-17bdbb214850-logs\") pod \"nova-metadata-0\" (UID: \"8f6a1398-04d6-4668-9689-17bdbb214850\") " pod="openstack/nova-metadata-0" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.410661 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6a1398-04d6-4668-9689-17bdbb214850-config-data\") pod \"nova-metadata-0\" (UID: \"8f6a1398-04d6-4668-9689-17bdbb214850\") " pod="openstack/nova-metadata-0" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.410736 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6a1398-04d6-4668-9689-17bdbb214850-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8f6a1398-04d6-4668-9689-17bdbb214850\") " pod="openstack/nova-metadata-0" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.410783 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f6a1398-04d6-4668-9689-17bdbb214850-logs\") pod \"nova-metadata-0\" (UID: \"8f6a1398-04d6-4668-9689-17bdbb214850\") " pod="openstack/nova-metadata-0" Mar 17 11:35:35 crc 
kubenswrapper[4742]: I0317 11:35:35.410849 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f6a1398-04d6-4668-9689-17bdbb214850-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8f6a1398-04d6-4668-9689-17bdbb214850\") " pod="openstack/nova-metadata-0" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.410890 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvl2s\" (UniqueName: \"kubernetes.io/projected/8f6a1398-04d6-4668-9689-17bdbb214850-kube-api-access-kvl2s\") pod \"nova-metadata-0\" (UID: \"8f6a1398-04d6-4668-9689-17bdbb214850\") " pod="openstack/nova-metadata-0" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.411786 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f6a1398-04d6-4668-9689-17bdbb214850-logs\") pod \"nova-metadata-0\" (UID: \"8f6a1398-04d6-4668-9689-17bdbb214850\") " pod="openstack/nova-metadata-0" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.415860 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f6a1398-04d6-4668-9689-17bdbb214850-config-data\") pod \"nova-metadata-0\" (UID: \"8f6a1398-04d6-4668-9689-17bdbb214850\") " pod="openstack/nova-metadata-0" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.415872 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f6a1398-04d6-4668-9689-17bdbb214850-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8f6a1398-04d6-4668-9689-17bdbb214850\") " pod="openstack/nova-metadata-0" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.416794 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f6a1398-04d6-4668-9689-17bdbb214850-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8f6a1398-04d6-4668-9689-17bdbb214850\") " pod="openstack/nova-metadata-0" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.430521 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvl2s\" (UniqueName: \"kubernetes.io/projected/8f6a1398-04d6-4668-9689-17bdbb214850-kube-api-access-kvl2s\") pod \"nova-metadata-0\" (UID: \"8f6a1398-04d6-4668-9689-17bdbb214850\") " pod="openstack/nova-metadata-0" Mar 17 11:35:35 crc kubenswrapper[4742]: I0317 11:35:35.486975 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 17 11:35:36 crc kubenswrapper[4742]: I0317 11:35:36.031577 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 17 11:35:36 crc kubenswrapper[4742]: W0317 11:35:36.035642 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f6a1398_04d6_4668_9689_17bdbb214850.slice/crio-1ae194ae39e4c0f1fe994f4a07b37c8c9a29946953c248df6c7d47e492d5b5c6 WatchSource:0}: Error finding container 1ae194ae39e4c0f1fe994f4a07b37c8c9a29946953c248df6c7d47e492d5b5c6: Status 404 returned error can't find the container with id 1ae194ae39e4c0f1fe994f4a07b37c8c9a29946953c248df6c7d47e492d5b5c6 Mar 17 11:35:36 crc kubenswrapper[4742]: I0317 11:35:36.090489 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8f6a1398-04d6-4668-9689-17bdbb214850","Type":"ContainerStarted","Data":"1ae194ae39e4c0f1fe994f4a07b37c8c9a29946953c248df6c7d47e492d5b5c6"} Mar 17 11:35:36 crc kubenswrapper[4742]: I0317 11:35:36.683678 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3d29251-108b-4705-8e84-36f40549b65c" path="/var/lib/kubelet/pods/f3d29251-108b-4705-8e84-36f40549b65c/volumes" Mar 17 11:35:37 crc kubenswrapper[4742]: I0317 11:35:37.105851 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8f6a1398-04d6-4668-9689-17bdbb214850","Type":"ContainerStarted","Data":"636e75d8d4738d6e73f1dd7a2932b2ad9d50dc489cb312f5f5be1f08930885d6"} Mar 17 11:35:37 crc kubenswrapper[4742]: I0317 11:35:37.105941 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8f6a1398-04d6-4668-9689-17bdbb214850","Type":"ContainerStarted","Data":"314b2df0e1823df51c2951ac449cc11255952b5f3366ba1d8d676fc88d067536"} Mar 17 11:35:37 crc kubenswrapper[4742]: I0317 11:35:37.135640 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.135621711 podStartE2EDuration="2.135621711s" podCreationTimestamp="2026-03-17 11:35:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:35:37.132278037 +0000 UTC m=+1440.258405795" watchObservedRunningTime="2026-03-17 11:35:37.135621711 +0000 UTC m=+1440.261749469" Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.120760 4742 generic.go:334] "Generic (PLEG): container finished" podID="e0032f44-c10f-40b5-9f46-d4e61972df14" containerID="fa6cb733ef3884e320cea26203b6b8787d65cee2e29770deaed5aba2116e8767" exitCode=0 Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.122040 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0032f44-c10f-40b5-9f46-d4e61972df14","Type":"ContainerDied","Data":"fa6cb733ef3884e320cea26203b6b8787d65cee2e29770deaed5aba2116e8767"} Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.122098 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e0032f44-c10f-40b5-9f46-d4e61972df14","Type":"ContainerDied","Data":"23b0ef8acfd32375ba0c67578a6a5b7ff40a366dba30225d7234e2dac1d8df03"} Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.122111 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23b0ef8acfd32375ba0c67578a6a5b7ff40a366dba30225d7234e2dac1d8df03" Mar 17 11:35:38 crc 
kubenswrapper[4742]: I0317 11:35:38.175795 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.275204 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-config-data\") pod \"e0032f44-c10f-40b5-9f46-d4e61972df14\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.275366 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-internal-tls-certs\") pod \"e0032f44-c10f-40b5-9f46-d4e61972df14\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.275542 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0032f44-c10f-40b5-9f46-d4e61972df14-logs\") pod \"e0032f44-c10f-40b5-9f46-d4e61972df14\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.275603 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-public-tls-certs\") pod \"e0032f44-c10f-40b5-9f46-d4e61972df14\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.275752 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jwdn\" (UniqueName: \"kubernetes.io/projected/e0032f44-c10f-40b5-9f46-d4e61972df14-kube-api-access-2jwdn\") pod \"e0032f44-c10f-40b5-9f46-d4e61972df14\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.275818 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-combined-ca-bundle\") pod \"e0032f44-c10f-40b5-9f46-d4e61972df14\" (UID: \"e0032f44-c10f-40b5-9f46-d4e61972df14\") " Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.276209 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0032f44-c10f-40b5-9f46-d4e61972df14-logs" (OuterVolumeSpecName: "logs") pod "e0032f44-c10f-40b5-9f46-d4e61972df14" (UID: "e0032f44-c10f-40b5-9f46-d4e61972df14"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.276781 4742 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0032f44-c10f-40b5-9f46-d4e61972df14-logs\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.283127 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0032f44-c10f-40b5-9f46-d4e61972df14-kube-api-access-2jwdn" (OuterVolumeSpecName: "kube-api-access-2jwdn") pod "e0032f44-c10f-40b5-9f46-d4e61972df14" (UID: "e0032f44-c10f-40b5-9f46-d4e61972df14"). InnerVolumeSpecName "kube-api-access-2jwdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.302934 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-config-data" (OuterVolumeSpecName: "config-data") pod "e0032f44-c10f-40b5-9f46-d4e61972df14" (UID: "e0032f44-c10f-40b5-9f46-d4e61972df14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.319366 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0032f44-c10f-40b5-9f46-d4e61972df14" (UID: "e0032f44-c10f-40b5-9f46-d4e61972df14"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.341646 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e0032f44-c10f-40b5-9f46-d4e61972df14" (UID: "e0032f44-c10f-40b5-9f46-d4e61972df14"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.357023 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e0032f44-c10f-40b5-9f46-d4e61972df14" (UID: "e0032f44-c10f-40b5-9f46-d4e61972df14"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.378119 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jwdn\" (UniqueName: \"kubernetes.io/projected/e0032f44-c10f-40b5-9f46-d4e61972df14-kube-api-access-2jwdn\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.378150 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.378159 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.378168 4742 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.378178 4742 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0032f44-c10f-40b5-9f46-d4e61972df14-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 17 11:35:38 crc kubenswrapper[4742]: I0317 11:35:38.505599 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.132367 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.161550 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.173726 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.195018 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 17 11:35:39 crc kubenswrapper[4742]: E0317 11:35:39.195772 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0032f44-c10f-40b5-9f46-d4e61972df14" containerName="nova-api-log" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.195807 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0032f44-c10f-40b5-9f46-d4e61972df14" containerName="nova-api-log" Mar 17 11:35:39 crc kubenswrapper[4742]: E0317 11:35:39.195954 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0032f44-c10f-40b5-9f46-d4e61972df14" containerName="nova-api-api" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.195977 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0032f44-c10f-40b5-9f46-d4e61972df14" containerName="nova-api-api" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.196249 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0032f44-c10f-40b5-9f46-d4e61972df14" containerName="nova-api-log" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.196291 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0032f44-c10f-40b5-9f46-d4e61972df14" containerName="nova-api-api" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.197573 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.200401 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.200699 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.213202 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.219361 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.294669 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b889c9-de23-4357-956c-1684e42c64de-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.295017 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b889c9-de23-4357-956c-1684e42c64de-public-tls-certs\") pod \"nova-api-0\" (UID: \"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.295262 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2b889c9-de23-4357-956c-1684e42c64de-config-data\") pod \"nova-api-0\" (UID: \"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.295313 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2b889c9-de23-4357-956c-1684e42c64de-logs\") pod \"nova-api-0\" (UID: \"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.295343 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvhbv\" (UniqueName: \"kubernetes.io/projected/f2b889c9-de23-4357-956c-1684e42c64de-kube-api-access-vvhbv\") pod \"nova-api-0\" (UID: \"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.295469 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b889c9-de23-4357-956c-1684e42c64de-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.397730 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b889c9-de23-4357-956c-1684e42c64de-public-tls-certs\") pod \"nova-api-0\" (UID: \"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.397878 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2b889c9-de23-4357-956c-1684e42c64de-config-data\") pod \"nova-api-0\" (UID: 
\"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.397998 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2b889c9-de23-4357-956c-1684e42c64de-logs\") pod \"nova-api-0\" (UID: \"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.398045 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvhbv\" (UniqueName: \"kubernetes.io/projected/f2b889c9-de23-4357-956c-1684e42c64de-kube-api-access-vvhbv\") pod \"nova-api-0\" (UID: \"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.398118 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b889c9-de23-4357-956c-1684e42c64de-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.398290 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b889c9-de23-4357-956c-1684e42c64de-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.398571 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2b889c9-de23-4357-956c-1684e42c64de-logs\") pod \"nova-api-0\" (UID: \"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.404008 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b889c9-de23-4357-956c-1684e42c64de-public-tls-certs\") pod \"nova-api-0\" (UID: \"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.404386 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b889c9-de23-4357-956c-1684e42c64de-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.404416 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2b889c9-de23-4357-956c-1684e42c64de-config-data\") pod \"nova-api-0\" (UID: \"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.404803 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2b889c9-de23-4357-956c-1684e42c64de-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.420883 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvhbv\" (UniqueName: \"kubernetes.io/projected/f2b889c9-de23-4357-956c-1684e42c64de-kube-api-access-vvhbv\") pod \"nova-api-0\" (UID: \"f2b889c9-de23-4357-956c-1684e42c64de\") " pod="openstack/nova-api-0" 
Mar 17 11:35:39 crc kubenswrapper[4742]: I0317 11:35:39.559514 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 17 11:35:40 crc kubenswrapper[4742]: I0317 11:35:40.040538 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 17 11:35:40 crc kubenswrapper[4742]: W0317 11:35:40.042611 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2b889c9_de23_4357_956c_1684e42c64de.slice/crio-6efe7ff99c5d80b4f8b5481b359448f77764addd867f3a56ff486b4559400af6 WatchSource:0}: Error finding container 6efe7ff99c5d80b4f8b5481b359448f77764addd867f3a56ff486b4559400af6: Status 404 returned error can't find the container with id 6efe7ff99c5d80b4f8b5481b359448f77764addd867f3a56ff486b4559400af6 Mar 17 11:35:40 crc kubenswrapper[4742]: I0317 11:35:40.144184 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2b889c9-de23-4357-956c-1684e42c64de","Type":"ContainerStarted","Data":"6efe7ff99c5d80b4f8b5481b359448f77764addd867f3a56ff486b4559400af6"} Mar 17 11:35:40 crc kubenswrapper[4742]: I0317 11:35:40.675026 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0032f44-c10f-40b5-9f46-d4e61972df14" path="/var/lib/kubelet/pods/e0032f44-c10f-40b5-9f46-d4e61972df14/volumes" Mar 17 11:35:41 crc kubenswrapper[4742]: I0317 11:35:41.162383 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2b889c9-de23-4357-956c-1684e42c64de","Type":"ContainerStarted","Data":"e0137caf4c10aaf3e9a9bf5d3a0cb8141b7e3b6f09b798b0b91c2da53d298718"} Mar 17 11:35:41 crc kubenswrapper[4742]: I0317 11:35:41.162478 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f2b889c9-de23-4357-956c-1684e42c64de","Type":"ContainerStarted","Data":"d26d978fa5ec91830f7021a5eb8bea0c786cec56ee95df2a3f361dbc0a282012"} Mar 17 11:35:41 crc kubenswrapper[4742]: I0317 11:35:41.193634 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.19361095 podStartE2EDuration="2.19361095s" podCreationTimestamp="2026-03-17 11:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:35:41.19002578 +0000 UTC m=+1444.316153568" watchObservedRunningTime="2026-03-17 11:35:41.19361095 +0000 UTC m=+1444.319738728" Mar 17 11:35:43 crc kubenswrapper[4742]: I0317 11:35:43.506084 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 17 11:35:43 crc kubenswrapper[4742]: I0317 11:35:43.564200 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 17 11:35:44 crc kubenswrapper[4742]: I0317 11:35:44.231817 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 17 11:35:45 crc kubenswrapper[4742]: I0317 11:35:45.488205 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 17 11:35:45 crc kubenswrapper[4742]: I0317 11:35:45.488324 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 17 11:35:46 crc kubenswrapper[4742]: I0317 11:35:46.506187 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="8f6a1398-04d6-4668-9689-17bdbb214850" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 11:35:46 crc kubenswrapper[4742]: I0317 11:35:46.506216 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8f6a1398-04d6-4668-9689-17bdbb214850" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.220:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 11:35:49 crc kubenswrapper[4742]: I0317 11:35:49.560169 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 11:35:49 crc kubenswrapper[4742]: I0317 11:35:49.560982 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 17 11:35:50 crc kubenswrapper[4742]: I0317 11:35:50.574264 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f2b889c9-de23-4357-956c-1684e42c64de" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.221:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 11:35:50 crc kubenswrapper[4742]: I0317 11:35:50.574294 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f2b889c9-de23-4357-956c-1684e42c64de" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.221:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 17 11:35:53 crc kubenswrapper[4742]: I0317 11:35:53.487632 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 17 11:35:53 crc kubenswrapper[4742]: I0317 11:35:53.488440 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 17 11:35:55 crc kubenswrapper[4742]: I0317 11:35:55.496826 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 17 11:35:55 crc kubenswrapper[4742]: I0317 11:35:55.498079 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 17 11:35:55 crc kubenswrapper[4742]: I0317 11:35:55.509267 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 17 11:35:56 crc kubenswrapper[4742]: I0317 11:35:56.336811 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 17 11:35:56 crc kubenswrapper[4742]: I0317 11:35:56.348131 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 17 11:35:57 crc kubenswrapper[4742]: I0317 11:35:57.560584 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 17 11:35:57 crc kubenswrapper[4742]: I0317 11:35:57.561571 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 17 11:35:59 crc kubenswrapper[4742]: I0317 11:35:59.568116 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 17 11:35:59 crc kubenswrapper[4742]: I0317 11:35:59.577191 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 17 11:35:59 crc kubenswrapper[4742]: I0317 11:35:59.582978 4742 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 17 11:36:00 crc kubenswrapper[4742]: I0317 11:36:00.187411 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562456-zl42f"] Mar 17 11:36:00 crc kubenswrapper[4742]: I0317 11:36:00.189281 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562456-zl42f" Mar 17 11:36:00 crc kubenswrapper[4742]: I0317 11:36:00.191672 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:36:00 crc kubenswrapper[4742]: I0317 11:36:00.192250 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:36:00 crc kubenswrapper[4742]: I0317 11:36:00.192481 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:36:00 crc kubenswrapper[4742]: I0317 11:36:00.196679 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562456-zl42f"] Mar 17 11:36:00 crc kubenswrapper[4742]: I0317 11:36:00.258141 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kksmq\" (UniqueName: \"kubernetes.io/projected/dd3294f4-0a77-467f-8178-8631afe227fe-kube-api-access-kksmq\") pod \"auto-csr-approver-29562456-zl42f\" (UID: \"dd3294f4-0a77-467f-8178-8631afe227fe\") " pod="openshift-infra/auto-csr-approver-29562456-zl42f" Mar 17 11:36:00 crc kubenswrapper[4742]: I0317 11:36:00.360501 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kksmq\" (UniqueName: \"kubernetes.io/projected/dd3294f4-0a77-467f-8178-8631afe227fe-kube-api-access-kksmq\") pod \"auto-csr-approver-29562456-zl42f\" (UID: \"dd3294f4-0a77-467f-8178-8631afe227fe\") " pod="openshift-infra/auto-csr-approver-29562456-zl42f" Mar 17 11:36:00 crc kubenswrapper[4742]: I0317 11:36:00.381085 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kksmq\" (UniqueName: \"kubernetes.io/projected/dd3294f4-0a77-467f-8178-8631afe227fe-kube-api-access-kksmq\") pod \"auto-csr-approver-29562456-zl42f\" (UID: \"dd3294f4-0a77-467f-8178-8631afe227fe\") " pod="openshift-infra/auto-csr-approver-29562456-zl42f" Mar 17 11:36:00 crc kubenswrapper[4742]: I0317 11:36:00.390702 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 17 11:36:00 crc kubenswrapper[4742]: I0317 11:36:00.508998 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562456-zl42f" Mar 17 11:36:00 crc kubenswrapper[4742]: I0317 11:36:00.941950 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562456-zl42f"] Mar 17 11:36:01 crc kubenswrapper[4742]: I0317 11:36:01.399819 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562456-zl42f" event={"ID":"dd3294f4-0a77-467f-8178-8631afe227fe","Type":"ContainerStarted","Data":"e94d4476f22d8a2073feae4ae478fc6f9673b52b7546320653a7852c079ca72d"} Mar 17 11:36:03 crc kubenswrapper[4742]: I0317 11:36:03.429878 4742 generic.go:334] "Generic (PLEG): container finished" podID="dd3294f4-0a77-467f-8178-8631afe227fe" containerID="20127a94628f13a2546341aa2e04fdef5eb4c56906bdb046295e675159e13cf0" exitCode=0 Mar 17 11:36:03 crc kubenswrapper[4742]: I0317 11:36:03.429973 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562456-zl42f" event={"ID":"dd3294f4-0a77-467f-8178-8631afe227fe","Type":"ContainerDied","Data":"20127a94628f13a2546341aa2e04fdef5eb4c56906bdb046295e675159e13cf0"} Mar 17 11:36:04 crc kubenswrapper[4742]: I0317 11:36:04.792252 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562456-zl42f" Mar 17 11:36:04 crc kubenswrapper[4742]: I0317 11:36:04.846793 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kksmq\" (UniqueName: \"kubernetes.io/projected/dd3294f4-0a77-467f-8178-8631afe227fe-kube-api-access-kksmq\") pod \"dd3294f4-0a77-467f-8178-8631afe227fe\" (UID: \"dd3294f4-0a77-467f-8178-8631afe227fe\") " Mar 17 11:36:04 crc kubenswrapper[4742]: I0317 11:36:04.859052 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd3294f4-0a77-467f-8178-8631afe227fe-kube-api-access-kksmq" (OuterVolumeSpecName: "kube-api-access-kksmq") pod "dd3294f4-0a77-467f-8178-8631afe227fe" (UID: "dd3294f4-0a77-467f-8178-8631afe227fe"). InnerVolumeSpecName "kube-api-access-kksmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:36:04 crc kubenswrapper[4742]: I0317 11:36:04.950846 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kksmq\" (UniqueName: \"kubernetes.io/projected/dd3294f4-0a77-467f-8178-8631afe227fe-kube-api-access-kksmq\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:05 crc kubenswrapper[4742]: I0317 11:36:05.454307 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562456-zl42f" event={"ID":"dd3294f4-0a77-467f-8178-8631afe227fe","Type":"ContainerDied","Data":"e94d4476f22d8a2073feae4ae478fc6f9673b52b7546320653a7852c079ca72d"} Mar 17 11:36:05 crc kubenswrapper[4742]: I0317 11:36:05.454375 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e94d4476f22d8a2073feae4ae478fc6f9673b52b7546320653a7852c079ca72d" Mar 17 11:36:05 crc kubenswrapper[4742]: I0317 11:36:05.454421 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562456-zl42f" Mar 17 11:36:05 crc kubenswrapper[4742]: I0317 11:36:05.904021 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562450-jzztm"] Mar 17 11:36:05 crc kubenswrapper[4742]: I0317 11:36:05.916571 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562450-jzztm"] Mar 17 11:36:06 crc kubenswrapper[4742]: I0317 11:36:06.681390 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="677f677c-39b4-4713-afc0-57fb6b36a1a3" path="/var/lib/kubelet/pods/677f677c-39b4-4713-afc0-57fb6b36a1a3/volumes" Mar 17 11:36:08 crc kubenswrapper[4742]: I0317 11:36:08.606133 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 11:36:09 crc kubenswrapper[4742]: I0317 11:36:09.494689 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 11:36:12 crc kubenswrapper[4742]: I0317 11:36:12.741091 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" containerName="rabbitmq" containerID="cri-o://f8811158aa410033c4850052e5f64091ae9d78c2cd5b4b4285c898d9d4837c55" gracePeriod=604796 Mar 17 11:36:13 crc kubenswrapper[4742]: I0317 11:36:13.791384 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="0d71d306-a987-411e-82fe-e18450aa18a2" containerName="rabbitmq" containerID="cri-o://0f7789cc70ff5ae1940a1e73e599735fcfd8df82cb6befebbe23b70ff21d4d9a" gracePeriod=604796 Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.552171 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rhqg8"] Mar 17 11:36:15 crc kubenswrapper[4742]: E0317 11:36:15.552978 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3294f4-0a77-467f-8178-8631afe227fe" containerName="oc" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.552996 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3294f4-0a77-467f-8178-8631afe227fe" containerName="oc" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.553314 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd3294f4-0a77-467f-8178-8631afe227fe" containerName="oc" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.555985 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rhqg8" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.577743 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rhqg8"] Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.658914 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac34fc1-9c6c-4ffb-a772-87e33f70a856-utilities\") pod \"redhat-operators-rhqg8\" (UID: \"cac34fc1-9c6c-4ffb-a772-87e33f70a856\") " pod="openshift-marketplace/redhat-operators-rhqg8" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.658966 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac34fc1-9c6c-4ffb-a772-87e33f70a856-catalog-content\") pod \"redhat-operators-rhqg8\" (UID: \"cac34fc1-9c6c-4ffb-a772-87e33f70a856\") " pod="openshift-marketplace/redhat-operators-rhqg8" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.659097 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfb5q\" (UniqueName: \"kubernetes.io/projected/cac34fc1-9c6c-4ffb-a772-87e33f70a856-kube-api-access-lfb5q\") pod \"redhat-operators-rhqg8\" (UID: \"cac34fc1-9c6c-4ffb-a772-87e33f70a856\") " pod="openshift-marketplace/redhat-operators-rhqg8" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.753164 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l7jsk"] Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.763580 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac34fc1-9c6c-4ffb-a772-87e33f70a856-utilities\") pod \"redhat-operators-rhqg8\" (UID: \"cac34fc1-9c6c-4ffb-a772-87e33f70a856\") " pod="openshift-marketplace/redhat-operators-rhqg8" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.763669 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac34fc1-9c6c-4ffb-a772-87e33f70a856-catalog-content\") pod \"redhat-operators-rhqg8\" (UID: \"cac34fc1-9c6c-4ffb-a772-87e33f70a856\") " pod="openshift-marketplace/redhat-operators-rhqg8" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.763883 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfb5q\" (UniqueName: \"kubernetes.io/projected/cac34fc1-9c6c-4ffb-a772-87e33f70a856-kube-api-access-lfb5q\") pod \"redhat-operators-rhqg8\" (UID: \"cac34fc1-9c6c-4ffb-a772-87e33f70a856\") " pod="openshift-marketplace/redhat-operators-rhqg8" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.764266 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7jsk"] Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.764436 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac34fc1-9c6c-4ffb-a772-87e33f70a856-utilities\") pod \"redhat-operators-rhqg8\" (UID: \"cac34fc1-9c6c-4ffb-a772-87e33f70a856\") " pod="openshift-marketplace/redhat-operators-rhqg8" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.764528 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l7jsk" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.764706 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac34fc1-9c6c-4ffb-a772-87e33f70a856-catalog-content\") pod \"redhat-operators-rhqg8\" (UID: \"cac34fc1-9c6c-4ffb-a772-87e33f70a856\") " pod="openshift-marketplace/redhat-operators-rhqg8" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.788061 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfb5q\" (UniqueName: \"kubernetes.io/projected/cac34fc1-9c6c-4ffb-a772-87e33f70a856-kube-api-access-lfb5q\") pod \"redhat-operators-rhqg8\" (UID: \"cac34fc1-9c6c-4ffb-a772-87e33f70a856\") " pod="openshift-marketplace/redhat-operators-rhqg8" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.866068 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xndlm\" (UniqueName: \"kubernetes.io/projected/2339ae01-ee38-4d70-a94c-6fab6a31226e-kube-api-access-xndlm\") pod \"redhat-marketplace-l7jsk\" (UID: \"2339ae01-ee38-4d70-a94c-6fab6a31226e\") " pod="openshift-marketplace/redhat-marketplace-l7jsk" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.866509 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2339ae01-ee38-4d70-a94c-6fab6a31226e-catalog-content\") pod \"redhat-marketplace-l7jsk\" (UID: \"2339ae01-ee38-4d70-a94c-6fab6a31226e\") " pod="openshift-marketplace/redhat-marketplace-l7jsk" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.866528 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2339ae01-ee38-4d70-a94c-6fab6a31226e-utilities\") pod \"redhat-marketplace-l7jsk\" (UID: \"2339ae01-ee38-4d70-a94c-6fab6a31226e\") " pod="openshift-marketplace/redhat-marketplace-l7jsk" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.882060 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rhqg8" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.968873 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2339ae01-ee38-4d70-a94c-6fab6a31226e-catalog-content\") pod \"redhat-marketplace-l7jsk\" (UID: \"2339ae01-ee38-4d70-a94c-6fab6a31226e\") " pod="openshift-marketplace/redhat-marketplace-l7jsk" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.968936 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2339ae01-ee38-4d70-a94c-6fab6a31226e-utilities\") pod \"redhat-marketplace-l7jsk\" (UID: \"2339ae01-ee38-4d70-a94c-6fab6a31226e\") " pod="openshift-marketplace/redhat-marketplace-l7jsk" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.969032 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xndlm\" (UniqueName: \"kubernetes.io/projected/2339ae01-ee38-4d70-a94c-6fab6a31226e-kube-api-access-xndlm\") pod \"redhat-marketplace-l7jsk\" (UID: \"2339ae01-ee38-4d70-a94c-6fab6a31226e\") " pod="openshift-marketplace/redhat-marketplace-l7jsk" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.969814 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2339ae01-ee38-4d70-a94c-6fab6a31226e-utilities\") pod \"redhat-marketplace-l7jsk\" (UID: \"2339ae01-ee38-4d70-a94c-6fab6a31226e\") " pod="openshift-marketplace/redhat-marketplace-l7jsk" Mar 17 11:36:15 crc kubenswrapper[4742]: I0317 11:36:15.969968 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2339ae01-ee38-4d70-a94c-6fab6a31226e-catalog-content\") pod \"redhat-marketplace-l7jsk\" (UID: \"2339ae01-ee38-4d70-a94c-6fab6a31226e\") " pod="openshift-marketplace/redhat-marketplace-l7jsk" Mar 17 11:36:16 crc kubenswrapper[4742]: I0317 11:36:16.000145 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xndlm\" (UniqueName: \"kubernetes.io/projected/2339ae01-ee38-4d70-a94c-6fab6a31226e-kube-api-access-xndlm\") pod \"redhat-marketplace-l7jsk\" (UID: \"2339ae01-ee38-4d70-a94c-6fab6a31226e\") " pod="openshift-marketplace/redhat-marketplace-l7jsk" Mar 17 11:36:16 crc kubenswrapper[4742]: I0317 11:36:16.086705 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l7jsk" Mar 17 11:36:16 crc kubenswrapper[4742]: I0317 11:36:16.174152 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rhqg8"] Mar 17 11:36:16 crc kubenswrapper[4742]: I0317 11:36:16.541542 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7jsk"] Mar 17 11:36:16 crc kubenswrapper[4742]: W0317 11:36:16.544997 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2339ae01_ee38_4d70_a94c_6fab6a31226e.slice/crio-23dbb275b467b6bfa61c6ca06364e6f449f8fd5526c2617d450f4f8f822e66c0 WatchSource:0}: Error finding container 23dbb275b467b6bfa61c6ca06364e6f449f8fd5526c2617d450f4f8f822e66c0: Status 404 returned error can't find the container with id 23dbb275b467b6bfa61c6ca06364e6f449f8fd5526c2617d450f4f8f822e66c0 Mar 17 11:36:16 crc kubenswrapper[4742]: I0317 11:36:16.575356 4742 generic.go:334] "Generic (PLEG): container finished" podID="cac34fc1-9c6c-4ffb-a772-87e33f70a856" containerID="5ebba01f69ca93513bf85116ccc2cc17fb509c39b51b3c9c9c44b72da6402b05" exitCode=0 Mar 17 11:36:16 crc kubenswrapper[4742]: I0317 11:36:16.575396 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhqg8" event={"ID":"cac34fc1-9c6c-4ffb-a772-87e33f70a856","Type":"ContainerDied","Data":"5ebba01f69ca93513bf85116ccc2cc17fb509c39b51b3c9c9c44b72da6402b05"} Mar 17 11:36:16 crc kubenswrapper[4742]: I0317 11:36:16.575438 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhqg8" event={"ID":"cac34fc1-9c6c-4ffb-a772-87e33f70a856","Type":"ContainerStarted","Data":"537df5638144c7ede68b3ed04b22d0720b1348daf9ea49492c03a29d479bf53d"} Mar 17 11:36:16 crc kubenswrapper[4742]: I0317 11:36:16.577894 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7jsk" event={"ID":"2339ae01-ee38-4d70-a94c-6fab6a31226e","Type":"ContainerStarted","Data":"23dbb275b467b6bfa61c6ca06364e6f449f8fd5526c2617d450f4f8f822e66c0"} Mar 17 11:36:17 crc kubenswrapper[4742]: I0317 11:36:17.590991 4742 generic.go:334] "Generic (PLEG): container finished" podID="2339ae01-ee38-4d70-a94c-6fab6a31226e" containerID="e5e2aa7f2cb721278a331431fcb7ba42894cd6da8f9b85bd3a1678d6a5e8da19" exitCode=0 Mar 17 11:36:17 crc kubenswrapper[4742]: I0317 11:36:17.591241 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7jsk" event={"ID":"2339ae01-ee38-4d70-a94c-6fab6a31226e","Type":"ContainerDied","Data":"e5e2aa7f2cb721278a331431fcb7ba42894cd6da8f9b85bd3a1678d6a5e8da19"} Mar 17 11:36:18 crc kubenswrapper[4742]: I0317 11:36:18.603348 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhqg8" event={"ID":"cac34fc1-9c6c-4ffb-a772-87e33f70a856","Type":"ContainerStarted","Data":"d403db49dbbfc88e0bff73873ce3567162423cef33907fbf81ab163ad9f266d5"} Mar 17 11:36:18 crc kubenswrapper[4742]: I0317 11:36:18.956448 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="0d71d306-a987-411e-82fe-e18450aa18a2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Mar 17 11:36:19 crc kubenswrapper[4742]: I0317 11:36:19.281010 4742 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" 
podUID="dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Mar 17 11:36:19 crc kubenswrapper[4742]: I0317 11:36:19.622156 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7jsk" event={"ID":"2339ae01-ee38-4d70-a94c-6fab6a31226e","Type":"ContainerStarted","Data":"48ee335a0b47251921012833eef14da3d2984a0107cc7544a335282cbeb8de4c"} Mar 17 11:36:20 crc kubenswrapper[4742]: I0317 11:36:20.636332 4742 generic.go:334] "Generic (PLEG): container finished" podID="2339ae01-ee38-4d70-a94c-6fab6a31226e" containerID="48ee335a0b47251921012833eef14da3d2984a0107cc7544a335282cbeb8de4c" exitCode=0 Mar 17 11:36:20 crc kubenswrapper[4742]: I0317 11:36:20.636775 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7jsk" event={"ID":"2339ae01-ee38-4d70-a94c-6fab6a31226e","Type":"ContainerDied","Data":"48ee335a0b47251921012833eef14da3d2984a0107cc7544a335282cbeb8de4c"} Mar 17 11:36:20 crc kubenswrapper[4742]: I0317 11:36:20.645597 4742 generic.go:334] "Generic (PLEG): container finished" podID="cac34fc1-9c6c-4ffb-a772-87e33f70a856" containerID="d403db49dbbfc88e0bff73873ce3567162423cef33907fbf81ab163ad9f266d5" exitCode=0 Mar 17 11:36:20 crc kubenswrapper[4742]: I0317 11:36:20.645663 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhqg8" event={"ID":"cac34fc1-9c6c-4ffb-a772-87e33f70a856","Type":"ContainerDied","Data":"d403db49dbbfc88e0bff73873ce3567162423cef33907fbf81ab163ad9f266d5"} Mar 17 11:36:20 crc kubenswrapper[4742]: I0317 11:36:20.654619 4742 generic.go:334] "Generic (PLEG): container finished" podID="dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" containerID="f8811158aa410033c4850052e5f64091ae9d78c2cd5b4b4285c898d9d4837c55" exitCode=0 Mar 17 11:36:20 crc kubenswrapper[4742]: I0317 11:36:20.654654 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6","Type":"ContainerDied","Data":"f8811158aa410033c4850052e5f64091ae9d78c2cd5b4b4285c898d9d4837c55"} Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.237690 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.326948 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-confd\") pod \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.327010 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-erlang-cookie\") pod \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.327034 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndv99\" (UniqueName: \"kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-kube-api-access-ndv99\") pod \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.327058 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-server-conf\") pod \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.327094 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-erlang-cookie-secret\") pod \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.327109 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-tls\") pod \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.327864 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-config-data\") pod \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.327944 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-pod-info\") pod \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.328025 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-plugins-conf\") pod \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.328125 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-plugins\") pod 
\"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.328140 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\" (UID: \"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.329704 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" (UID: "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.330531 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" (UID: "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.330549 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" (UID: "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.343207 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" (UID: "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.374335 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" (UID: "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.374401 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" (UID: "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.378431 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-kube-api-access-ndv99" (OuterVolumeSpecName: "kube-api-access-ndv99") pod "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" (UID: "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6"). InnerVolumeSpecName "kube-api-access-ndv99". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.382488 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-pod-info" (OuterVolumeSpecName: "pod-info") pod "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" (UID: "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.382630 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-phn6z"] Mar 17 11:36:21 crc kubenswrapper[4742]: E0317 11:36:21.383003 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" containerName="rabbitmq" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.383015 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" containerName="rabbitmq" Mar 17 11:36:21 crc kubenswrapper[4742]: E0317 11:36:21.383049 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" containerName="setup-container" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.383056 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" containerName="setup-container" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.383225 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" containerName="rabbitmq" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.386270 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.404638 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.421886 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-phn6z"] Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.430762 4742 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.430803 4742 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.430836 4742 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.430847 4742 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.430857 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndv99\" (UniqueName: \"kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-kube-api-access-ndv99\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 
11:36:21.430867 4742 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.430874 4742 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.430882 4742 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-pod-info\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.439096 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-config-data" (OuterVolumeSpecName: "config-data") pod "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" (UID: "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.485844 4742 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.528262 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-server-conf" (OuterVolumeSpecName: "server-conf") pod "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" (UID: "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.531926 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.531974 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.532112 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" (UID: "dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.532134 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.532292 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-config\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.532347 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmr5n\" (UniqueName: \"kubernetes.io/projected/204902d3-4729-4f78-ba39-d5495676a514-kube-api-access-dmr5n\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.532392 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.532461 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.532590 4742 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.532607 4742 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.532621 4742 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-server-conf\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.532631 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.634082 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-config\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.634382 
4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmr5n\" (UniqueName: \"kubernetes.io/projected/204902d3-4729-4f78-ba39-d5495676a514-kube-api-access-dmr5n\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.634434 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.634470 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.634548 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.634603 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.634832 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.635181 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-config\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.635565 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.635982 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.636264 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.636270 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.636496 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.654028 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmr5n\" (UniqueName: \"kubernetes.io/projected/204902d3-4729-4f78-ba39-d5495676a514-kube-api-access-dmr5n\") pod \"dnsmasq-dns-79bd4cc8c9-phn6z\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.672406 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7jsk" event={"ID":"2339ae01-ee38-4d70-a94c-6fab6a31226e","Type":"ContainerStarted","Data":"84d3566d01a4360e53d8e806d907d1de6bc02110b210431578cc6686f5d72fc5"} Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.692375 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l7jsk" podStartSLOduration=2.962644215 podStartE2EDuration="6.692357763s" podCreationTimestamp="2026-03-17 11:36:15 +0000 UTC" firstStartedPulling="2026-03-17 11:36:17.593468264 +0000 UTC m=+1480.719596042" lastFinishedPulling="2026-03-17 11:36:21.323181832 +0000 UTC m=+1484.449309590" observedRunningTime="2026-03-17 11:36:21.685232144 +0000 UTC m=+1484.811359912" watchObservedRunningTime="2026-03-17 11:36:21.692357763 +0000 UTC m=+1484.818485521" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.705218 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhqg8" event={"ID":"cac34fc1-9c6c-4ffb-a772-87e33f70a856","Type":"ContainerStarted","Data":"41a3eda8f97221db7de1900157040679f7b22e52083e55fea7fe5f48089b2649"} Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.708590 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6","Type":"ContainerDied","Data":"8a77c3656f9054dd75e53300541f1f19547cc5f8d1cd2c159960b51c828a299a"} Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.708641 4742 scope.go:117] "RemoveContainer" containerID="f8811158aa410033c4850052e5f64091ae9d78c2cd5b4b4285c898d9d4837c55" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.708760 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.717374 4742 generic.go:334] "Generic (PLEG): container finished" podID="0d71d306-a987-411e-82fe-e18450aa18a2" containerID="0f7789cc70ff5ae1940a1e73e599735fcfd8df82cb6befebbe23b70ff21d4d9a" exitCode=0 Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.717421 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d71d306-a987-411e-82fe-e18450aa18a2","Type":"ContainerDied","Data":"0f7789cc70ff5ae1940a1e73e599735fcfd8df82cb6befebbe23b70ff21d4d9a"} Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.731546 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rhqg8" podStartSLOduration=2.195703935 podStartE2EDuration="6.731528205s" podCreationTimestamp="2026-03-17 11:36:15 +0000 UTC" firstStartedPulling="2026-03-17 11:36:16.577009679 +0000 UTC m=+1479.703137437" lastFinishedPulling="2026-03-17 11:36:21.112833959 +0000 UTC m=+1484.238961707" observedRunningTime="2026-03-17 11:36:21.7256104 +0000 UTC m=+1484.851738168" watchObservedRunningTime="2026-03-17 11:36:21.731528205 +0000 UTC m=+1484.857655953" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.740842 4742 scope.go:117] "RemoveContainer" containerID="2b56274b6b78ca4e5410d6fa294dba941d61ff2a15e2f2b60bc50b901df2e13d" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.753858 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.767587 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.777570 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.786978 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 11:36:21 crc kubenswrapper[4742]: E0317 11:36:21.787446 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d71d306-a987-411e-82fe-e18450aa18a2" containerName="rabbitmq" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.787472 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d71d306-a987-411e-82fe-e18450aa18a2" containerName="rabbitmq" Mar 17 11:36:21 crc kubenswrapper[4742]: E0317 11:36:21.787494 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d71d306-a987-411e-82fe-e18450aa18a2" containerName="setup-container" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.787503 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d71d306-a987-411e-82fe-e18450aa18a2" containerName="setup-container" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.787737 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d71d306-a987-411e-82fe-e18450aa18a2" containerName="rabbitmq" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.792516 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.796458 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.796703 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.796871 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.797125 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.797286 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.797443 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ls6t5" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.797614 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.800256 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.825730 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.838537 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr5zf\" (UniqueName: \"kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-kube-api-access-rr5zf\") pod \"0d71d306-a987-411e-82fe-e18450aa18a2\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.838602 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-confd\") pod \"0d71d306-a987-411e-82fe-e18450aa18a2\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.838659 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d71d306-a987-411e-82fe-e18450aa18a2-erlang-cookie-secret\") pod \"0d71d306-a987-411e-82fe-e18450aa18a2\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.838679 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"0d71d306-a987-411e-82fe-e18450aa18a2\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.838719 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-plugins\") pod \"0d71d306-a987-411e-82fe-e18450aa18a2\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.838746 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-server-conf\") pod \"0d71d306-a987-411e-82fe-e18450aa18a2\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.838784 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-erlang-cookie\") pod \"0d71d306-a987-411e-82fe-e18450aa18a2\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.838861 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-tls\") pod \"0d71d306-a987-411e-82fe-e18450aa18a2\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.838880 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0d71d306-a987-411e-82fe-e18450aa18a2-pod-info\") pod \"0d71d306-a987-411e-82fe-e18450aa18a2\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.838965 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-config-data\") pod \"0d71d306-a987-411e-82fe-e18450aa18a2\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.839023 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-plugins-conf\") pod \"0d71d306-a987-411e-82fe-e18450aa18a2\" (UID: \"0d71d306-a987-411e-82fe-e18450aa18a2\") " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.845111 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0d71d306-a987-411e-82fe-e18450aa18a2" (UID: "0d71d306-a987-411e-82fe-e18450aa18a2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.845467 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-kube-api-access-rr5zf" (OuterVolumeSpecName: "kube-api-access-rr5zf") pod "0d71d306-a987-411e-82fe-e18450aa18a2" (UID: "0d71d306-a987-411e-82fe-e18450aa18a2"). InnerVolumeSpecName "kube-api-access-rr5zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.849178 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0d71d306-a987-411e-82fe-e18450aa18a2" (UID: "0d71d306-a987-411e-82fe-e18450aa18a2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.853092 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d71d306-a987-411e-82fe-e18450aa18a2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0d71d306-a987-411e-82fe-e18450aa18a2" (UID: "0d71d306-a987-411e-82fe-e18450aa18a2"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.854092 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0d71d306-a987-411e-82fe-e18450aa18a2" (UID: "0d71d306-a987-411e-82fe-e18450aa18a2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.858625 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0d71d306-a987-411e-82fe-e18450aa18a2-pod-info" (OuterVolumeSpecName: "pod-info") pod "0d71d306-a987-411e-82fe-e18450aa18a2" (UID: "0d71d306-a987-411e-82fe-e18450aa18a2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.858756 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "0d71d306-a987-411e-82fe-e18450aa18a2" (UID: "0d71d306-a987-411e-82fe-e18450aa18a2"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.864875 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0d71d306-a987-411e-82fe-e18450aa18a2" (UID: "0d71d306-a987-411e-82fe-e18450aa18a2"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.893394 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-config-data" (OuterVolumeSpecName: "config-data") pod "0d71d306-a987-411e-82fe-e18450aa18a2" (UID: "0d71d306-a987-411e-82fe-e18450aa18a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.934046 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-server-conf" (OuterVolumeSpecName: "server-conf") pod "0d71d306-a987-411e-82fe-e18450aa18a2" (UID: "0d71d306-a987-411e-82fe-e18450aa18a2"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.940546 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4e8c9887-8315-444e-b3dd-9753e83f83fa-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.940637 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4e8c9887-8315-444e-b3dd-9753e83f83fa-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.940672 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4e8c9887-8315-444e-b3dd-9753e83f83fa-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.940691 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4e8c9887-8315-444e-b3dd-9753e83f83fa-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.940708 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4e8c9887-8315-444e-b3dd-9753e83f83fa-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.940731 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4e8c9887-8315-444e-b3dd-9753e83f83fa-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.940786 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e8c9887-8315-444e-b3dd-9753e83f83fa-config-data\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.940811 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4e8c9887-8315-444e-b3dd-9753e83f83fa-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.940849 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.940876 
4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq852\" (UniqueName: \"kubernetes.io/projected/4e8c9887-8315-444e-b3dd-9753e83f83fa-kube-api-access-dq852\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.940894 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4e8c9887-8315-444e-b3dd-9753e83f83fa-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.940959 4742 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.940970 4742 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.940979 4742 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0d71d306-a987-411e-82fe-e18450aa18a2-pod-info\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.940987 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.941000 4742 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.941008 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr5zf\" (UniqueName: \"kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-kube-api-access-rr5zf\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.941027 4742 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.941036 4742 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0d71d306-a987-411e-82fe-e18450aa18a2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.941045 4742 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.941053 4742 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0d71d306-a987-411e-82fe-e18450aa18a2-server-conf\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:21 crc kubenswrapper[4742]: I0317 11:36:21.974310 4742 operation_generator.go:917] UnmountDevice succeeded for volume 
"local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.014493 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0d71d306-a987-411e-82fe-e18450aa18a2" (UID: "0d71d306-a987-411e-82fe-e18450aa18a2"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.044995 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4e8c9887-8315-444e-b3dd-9753e83f83fa-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.045044 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4e8c9887-8315-444e-b3dd-9753e83f83fa-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.045074 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4e8c9887-8315-444e-b3dd-9753e83f83fa-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.045106 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4e8c9887-8315-444e-b3dd-9753e83f83fa-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.045150 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e8c9887-8315-444e-b3dd-9753e83f83fa-config-data\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.045188 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4e8c9887-8315-444e-b3dd-9753e83f83fa-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.045243 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.045280 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq852\" (UniqueName: \"kubernetes.io/projected/4e8c9887-8315-444e-b3dd-9753e83f83fa-kube-api-access-dq852\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.045304 4742 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4e8c9887-8315-444e-b3dd-9753e83f83fa-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.045342 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4e8c9887-8315-444e-b3dd-9753e83f83fa-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.045418 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4e8c9887-8315-444e-b3dd-9753e83f83fa-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.045438 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4e8c9887-8315-444e-b3dd-9753e83f83fa-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.045683 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4e8c9887-8315-444e-b3dd-9753e83f83fa-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.045758 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.046617 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4e8c9887-8315-444e-b3dd-9753e83f83fa-config-data\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.046919 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4e8c9887-8315-444e-b3dd-9753e83f83fa-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.048035 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4e8c9887-8315-444e-b3dd-9753e83f83fa-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.048849 4742 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0d71d306-a987-411e-82fe-e18450aa18a2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.048873 4742 
reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.050761 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4e8c9887-8315-444e-b3dd-9753e83f83fa-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.051536 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4e8c9887-8315-444e-b3dd-9753e83f83fa-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.051744 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4e8c9887-8315-444e-b3dd-9753e83f83fa-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.054803 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4e8c9887-8315-444e-b3dd-9753e83f83fa-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.060572 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq852\" (UniqueName: \"kubernetes.io/projected/4e8c9887-8315-444e-b3dd-9753e83f83fa-kube-api-access-dq852\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.080555 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"4e8c9887-8315-444e-b3dd-9753e83f83fa\") " pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.131326 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.374961 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-phn6z"] Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.683241 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6" path="/var/lib/kubelet/pods/dfda8bfd-1185-4aaa-92d8-4fdbc40c81d6/volumes" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.729692 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0d71d306-a987-411e-82fe-e18450aa18a2","Type":"ContainerDied","Data":"c1aed7c500e49f2ba7a5ee494e2df33faded14c193f7d81894b2b827b90ee903"} Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.729736 4742 scope.go:117] "RemoveContainer" containerID="0f7789cc70ff5ae1940a1e73e599735fcfd8df82cb6befebbe23b70ff21d4d9a" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.729862 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.733235 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" event={"ID":"204902d3-4729-4f78-ba39-d5495676a514","Type":"ContainerStarted","Data":"a564811358144cc61faf392366df2fe8f396b29e2f40ffb41049b88d1665bb2e"} Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.733269 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" event={"ID":"204902d3-4729-4f78-ba39-d5495676a514","Type":"ContainerStarted","Data":"365711c795d727754be779f1c5c4aa0ac401ad7790ec2326701fb86ca47ba29d"} Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.759664 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.765154 4742 scope.go:117] "RemoveContainer" containerID="ae2be08fc5ec8464794b9d028f78ef7f5e9da6e8e3861cfa52e24654763af4af" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.782056 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.824120 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.825778 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.833050 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.844744 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.852507 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-2l6fw" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.852757 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.856431 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.856696 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.856807 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.857067 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.857553 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.876410 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6c10e471-26c3-41ec-bf47-a5edf33c173d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.876486 4742 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6c10e471-26c3-41ec-bf47-a5edf33c173d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.876551 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6c10e471-26c3-41ec-bf47-a5edf33c173d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.876584 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.876922 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6c10e471-26c3-41ec-bf47-a5edf33c173d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.876976 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6c10e471-26c3-41ec-bf47-a5edf33c173d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.877023 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6c10e471-26c3-41ec-bf47-a5edf33c173d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.877055 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6c10e471-26c3-41ec-bf47-a5edf33c173d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.877083 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnw5c\" (UniqueName: \"kubernetes.io/projected/6c10e471-26c3-41ec-bf47-a5edf33c173d-kube-api-access-mnw5c\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.877117 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c10e471-26c3-41ec-bf47-a5edf33c173d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 
11:36:22.877165 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6c10e471-26c3-41ec-bf47-a5edf33c173d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.978671 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6c10e471-26c3-41ec-bf47-a5edf33c173d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.978724 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6c10e471-26c3-41ec-bf47-a5edf33c173d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.978761 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6c10e471-26c3-41ec-bf47-a5edf33c173d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.978794 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6c10e471-26c3-41ec-bf47-a5edf33c173d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.978850 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnw5c\" (UniqueName: \"kubernetes.io/projected/6c10e471-26c3-41ec-bf47-a5edf33c173d-kube-api-access-mnw5c\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.979300 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6c10e471-26c3-41ec-bf47-a5edf33c173d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.979407 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6c10e471-26c3-41ec-bf47-a5edf33c173d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.979480 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c10e471-26c3-41ec-bf47-a5edf33c173d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.979855 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/6c10e471-26c3-41ec-bf47-a5edf33c173d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.980216 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c10e471-26c3-41ec-bf47-a5edf33c173d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.980363 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6c10e471-26c3-41ec-bf47-a5edf33c173d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.980448 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6c10e471-26c3-41ec-bf47-a5edf33c173d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.980485 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6c10e471-26c3-41ec-bf47-a5edf33c173d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.980934 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6c10e471-26c3-41ec-bf47-a5edf33c173d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.980969 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.981646 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6c10e471-26c3-41ec-bf47-a5edf33c173d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.981471 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.984425 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6c10e471-26c3-41ec-bf47-a5edf33c173d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.988315 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6c10e471-26c3-41ec-bf47-a5edf33c173d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.988830 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6c10e471-26c3-41ec-bf47-a5edf33c173d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.989277 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6c10e471-26c3-41ec-bf47-a5edf33c173d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:22 crc kubenswrapper[4742]: I0317 11:36:22.999144 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnw5c\" (UniqueName: \"kubernetes.io/projected/6c10e471-26c3-41ec-bf47-a5edf33c173d-kube-api-access-mnw5c\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:23 crc kubenswrapper[4742]: I0317 11:36:23.016109 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6c10e471-26c3-41ec-bf47-a5edf33c173d\") " pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:23 crc kubenswrapper[4742]: I0317 11:36:23.160259 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:36:23 crc kubenswrapper[4742]: I0317 11:36:23.649171 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 17 11:36:23 crc kubenswrapper[4742]: W0317 11:36:23.657153 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c10e471_26c3_41ec_bf47_a5edf33c173d.slice/crio-c159a0e43f82f2e102663054557bb91ecf28646cc784efa158cbdf5c743c73cd WatchSource:0}: Error finding container c159a0e43f82f2e102663054557bb91ecf28646cc784efa158cbdf5c743c73cd: Status 404 returned error can't find the container with id c159a0e43f82f2e102663054557bb91ecf28646cc784efa158cbdf5c743c73cd Mar 17 11:36:23 crc kubenswrapper[4742]: I0317 11:36:23.742849 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4e8c9887-8315-444e-b3dd-9753e83f83fa","Type":"ContainerStarted","Data":"4e0ab2888ea2544987ce53fac20a9e64c5bbcbf429579fab66b7a1e237446b68"} Mar 17 11:36:23 crc kubenswrapper[4742]: I0317 11:36:23.744441 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6c10e471-26c3-41ec-bf47-a5edf33c173d","Type":"ContainerStarted","Data":"c159a0e43f82f2e102663054557bb91ecf28646cc784efa158cbdf5c743c73cd"} Mar 17 11:36:23 crc kubenswrapper[4742]: I0317 11:36:23.748775 4742 generic.go:334] "Generic (PLEG): container finished" podID="204902d3-4729-4f78-ba39-d5495676a514" containerID="a564811358144cc61faf392366df2fe8f396b29e2f40ffb41049b88d1665bb2e" exitCode=0 Mar 17 11:36:23 crc kubenswrapper[4742]: I0317 11:36:23.748841 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" event={"ID":"204902d3-4729-4f78-ba39-d5495676a514","Type":"ContainerDied","Data":"a564811358144cc61faf392366df2fe8f396b29e2f40ffb41049b88d1665bb2e"} Mar 17 11:36:24 crc kubenswrapper[4742]: I0317 11:36:24.675726 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d71d306-a987-411e-82fe-e18450aa18a2" path="/var/lib/kubelet/pods/0d71d306-a987-411e-82fe-e18450aa18a2/volumes" Mar 17 11:36:24 crc kubenswrapper[4742]: I0317 11:36:24.762183 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" event={"ID":"204902d3-4729-4f78-ba39-d5495676a514","Type":"ContainerStarted","Data":"5c6bfb504b711fa3c0c16f626be67dc1479ce3838de3a9caaae00e79da969e41"} Mar 17 11:36:24 crc kubenswrapper[4742]: I0317 11:36:24.763006 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:24 crc kubenswrapper[4742]: I0317 11:36:24.764058 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4e8c9887-8315-444e-b3dd-9753e83f83fa","Type":"ContainerStarted","Data":"ecf1386a52ef8ac6d953a6b6c75367f0adf70e1490942177d2c4dcbd922f8051"} Mar 17 11:36:24 crc kubenswrapper[4742]: I0317 11:36:24.791173 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" podStartSLOduration=3.791145025 podStartE2EDuration="3.791145025s" podCreationTimestamp="2026-03-17 11:36:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:36:24.782585376 +0000 UTC m=+1487.908713174" watchObservedRunningTime="2026-03-17 11:36:24.791145025 +0000 UTC 
m=+1487.917272783" Mar 17 11:36:25 crc kubenswrapper[4742]: I0317 11:36:25.883660 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rhqg8" Mar 17 11:36:25 crc kubenswrapper[4742]: I0317 11:36:25.884156 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rhqg8" Mar 17 11:36:26 crc kubenswrapper[4742]: I0317 11:36:26.088554 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l7jsk" Mar 17 11:36:26 crc kubenswrapper[4742]: I0317 11:36:26.088603 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l7jsk" Mar 17 11:36:26 crc kubenswrapper[4742]: I0317 11:36:26.144522 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l7jsk" Mar 17 11:36:26 crc kubenswrapper[4742]: I0317 11:36:26.795758 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6c10e471-26c3-41ec-bf47-a5edf33c173d","Type":"ContainerStarted","Data":"61ad8c14db28965b0e7ec192ccbd1132aef249cfd1e5e01434a859b98cc2f305"} Mar 17 11:36:26 crc kubenswrapper[4742]: I0317 11:36:26.851861 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l7jsk" Mar 17 11:36:26 crc kubenswrapper[4742]: I0317 11:36:26.899627 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7jsk"] Mar 17 11:36:26 crc kubenswrapper[4742]: I0317 11:36:26.952607 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rhqg8" podUID="cac34fc1-9c6c-4ffb-a772-87e33f70a856" containerName="registry-server" probeResult="failure" output=< Mar 17 11:36:26 crc kubenswrapper[4742]: timeout: failed to connect service ":50051" within 1s Mar 17 11:36:26 crc kubenswrapper[4742]: > Mar 17 11:36:28 crc kubenswrapper[4742]: I0317 11:36:28.811165 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l7jsk" podUID="2339ae01-ee38-4d70-a94c-6fab6a31226e" containerName="registry-server" containerID="cri-o://84d3566d01a4360e53d8e806d907d1de6bc02110b210431578cc6686f5d72fc5" gracePeriod=2 Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.395644 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l7jsk" Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.515387 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2339ae01-ee38-4d70-a94c-6fab6a31226e-utilities\") pod \"2339ae01-ee38-4d70-a94c-6fab6a31226e\" (UID: \"2339ae01-ee38-4d70-a94c-6fab6a31226e\") " Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.515530 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2339ae01-ee38-4d70-a94c-6fab6a31226e-catalog-content\") pod \"2339ae01-ee38-4d70-a94c-6fab6a31226e\" (UID: \"2339ae01-ee38-4d70-a94c-6fab6a31226e\") " Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.515675 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xndlm\" (UniqueName: \"kubernetes.io/projected/2339ae01-ee38-4d70-a94c-6fab6a31226e-kube-api-access-xndlm\") pod \"2339ae01-ee38-4d70-a94c-6fab6a31226e\" (UID: \"2339ae01-ee38-4d70-a94c-6fab6a31226e\") " Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.516249 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2339ae01-ee38-4d70-a94c-6fab6a31226e-utilities" (OuterVolumeSpecName: "utilities") pod "2339ae01-ee38-4d70-a94c-6fab6a31226e" (UID: "2339ae01-ee38-4d70-a94c-6fab6a31226e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.524613 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2339ae01-ee38-4d70-a94c-6fab6a31226e-kube-api-access-xndlm" (OuterVolumeSpecName: "kube-api-access-xndlm") pod "2339ae01-ee38-4d70-a94c-6fab6a31226e" (UID: "2339ae01-ee38-4d70-a94c-6fab6a31226e"). InnerVolumeSpecName "kube-api-access-xndlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.561632 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2339ae01-ee38-4d70-a94c-6fab6a31226e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2339ae01-ee38-4d70-a94c-6fab6a31226e" (UID: "2339ae01-ee38-4d70-a94c-6fab6a31226e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.618915 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xndlm\" (UniqueName: \"kubernetes.io/projected/2339ae01-ee38-4d70-a94c-6fab6a31226e-kube-api-access-xndlm\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.618950 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2339ae01-ee38-4d70-a94c-6fab6a31226e-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.618983 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2339ae01-ee38-4d70-a94c-6fab6a31226e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.830888 4742 generic.go:334] "Generic (PLEG): container finished" podID="2339ae01-ee38-4d70-a94c-6fab6a31226e" containerID="84d3566d01a4360e53d8e806d907d1de6bc02110b210431578cc6686f5d72fc5" exitCode=0 Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.830996 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7jsk" event={"ID":"2339ae01-ee38-4d70-a94c-6fab6a31226e","Type":"ContainerDied","Data":"84d3566d01a4360e53d8e806d907d1de6bc02110b210431578cc6686f5d72fc5"} Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.831305 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l7jsk" event={"ID":"2339ae01-ee38-4d70-a94c-6fab6a31226e","Type":"ContainerDied","Data":"23dbb275b467b6bfa61c6ca06364e6f449f8fd5526c2617d450f4f8f822e66c0"} Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.831337 4742 scope.go:117] "RemoveContainer" containerID="84d3566d01a4360e53d8e806d907d1de6bc02110b210431578cc6686f5d72fc5" Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.831067 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l7jsk" Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.882424 4742 scope.go:117] "RemoveContainer" containerID="48ee335a0b47251921012833eef14da3d2984a0107cc7544a335282cbeb8de4c" Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.896230 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7jsk"] Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.904530 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l7jsk"] Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.916420 4742 scope.go:117] "RemoveContainer" containerID="e5e2aa7f2cb721278a331431fcb7ba42894cd6da8f9b85bd3a1678d6a5e8da19" Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.992374 4742 scope.go:117] "RemoveContainer" containerID="84d3566d01a4360e53d8e806d907d1de6bc02110b210431578cc6686f5d72fc5" Mar 17 11:36:29 crc kubenswrapper[4742]: E0317 11:36:29.995298 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84d3566d01a4360e53d8e806d907d1de6bc02110b210431578cc6686f5d72fc5\": container with ID starting with 84d3566d01a4360e53d8e806d907d1de6bc02110b210431578cc6686f5d72fc5 not found: ID does not exist" containerID="84d3566d01a4360e53d8e806d907d1de6bc02110b210431578cc6686f5d72fc5" Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.995367 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d3566d01a4360e53d8e806d907d1de6bc02110b210431578cc6686f5d72fc5"} err="failed to get container status \"84d3566d01a4360e53d8e806d907d1de6bc02110b210431578cc6686f5d72fc5\": rpc error: code = NotFound desc = could not find container \"84d3566d01a4360e53d8e806d907d1de6bc02110b210431578cc6686f5d72fc5\": container with ID starting with 84d3566d01a4360e53d8e806d907d1de6bc02110b210431578cc6686f5d72fc5 not found: ID does not exist" Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.995414 4742 scope.go:117] "RemoveContainer" containerID="48ee335a0b47251921012833eef14da3d2984a0107cc7544a335282cbeb8de4c" Mar 17 11:36:29 crc kubenswrapper[4742]: E0317 11:36:29.995885 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48ee335a0b47251921012833eef14da3d2984a0107cc7544a335282cbeb8de4c\": container with ID starting with 48ee335a0b47251921012833eef14da3d2984a0107cc7544a335282cbeb8de4c not found: ID does not exist" containerID="48ee335a0b47251921012833eef14da3d2984a0107cc7544a335282cbeb8de4c" Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.995975 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48ee335a0b47251921012833eef14da3d2984a0107cc7544a335282cbeb8de4c"} err="failed to get container status \"48ee335a0b47251921012833eef14da3d2984a0107cc7544a335282cbeb8de4c\": rpc error: code = NotFound desc = could not find container \"48ee335a0b47251921012833eef14da3d2984a0107cc7544a335282cbeb8de4c\": container with ID starting with 48ee335a0b47251921012833eef14da3d2984a0107cc7544a335282cbeb8de4c not found: ID does not exist" Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.996008 4742 scope.go:117] "RemoveContainer" containerID="e5e2aa7f2cb721278a331431fcb7ba42894cd6da8f9b85bd3a1678d6a5e8da19" Mar 17 11:36:29 crc kubenswrapper[4742]: E0317 11:36:29.996750 4742 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e5e2aa7f2cb721278a331431fcb7ba42894cd6da8f9b85bd3a1678d6a5e8da19\": container with ID starting with e5e2aa7f2cb721278a331431fcb7ba42894cd6da8f9b85bd3a1678d6a5e8da19 not found: ID does not exist" containerID="e5e2aa7f2cb721278a331431fcb7ba42894cd6da8f9b85bd3a1678d6a5e8da19" Mar 17 11:36:29 crc kubenswrapper[4742]: I0317 11:36:29.996865 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5e2aa7f2cb721278a331431fcb7ba42894cd6da8f9b85bd3a1678d6a5e8da19"} err="failed to get container status \"e5e2aa7f2cb721278a331431fcb7ba42894cd6da8f9b85bd3a1678d6a5e8da19\": rpc error: code = NotFound desc = could not find container \"e5e2aa7f2cb721278a331431fcb7ba42894cd6da8f9b85bd3a1678d6a5e8da19\": container with ID starting with e5e2aa7f2cb721278a331431fcb7ba42894cd6da8f9b85bd3a1678d6a5e8da19 not found: ID does not exist" Mar 17 11:36:30 crc kubenswrapper[4742]: I0317 11:36:30.675386 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2339ae01-ee38-4d70-a94c-6fab6a31226e" path="/var/lib/kubelet/pods/2339ae01-ee38-4d70-a94c-6fab6a31226e/volumes" Mar 17 11:36:31 crc kubenswrapper[4742]: I0317 11:36:31.828215 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:31 crc kubenswrapper[4742]: I0317 11:36:31.915080 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-dt4p6"] Mar 17 11:36:31 crc kubenswrapper[4742]: I0317 11:36:31.915472 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" podUID="ab1d5568-a84d-4397-b93c-6b997192fb30" containerName="dnsmasq-dns" containerID="cri-o://9fa46cb0281903ea2740e889c78f20f753154a8df89b1ea8118a440585c6bd72" gracePeriod=10 Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.140962 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-z42hc"] Mar 17 11:36:32 crc kubenswrapper[4742]: E0317 11:36:32.141407 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2339ae01-ee38-4d70-a94c-6fab6a31226e" containerName="extract-content" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.141431 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="2339ae01-ee38-4d70-a94c-6fab6a31226e" containerName="extract-content" Mar 17 11:36:32 crc kubenswrapper[4742]: E0317 11:36:32.141440 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2339ae01-ee38-4d70-a94c-6fab6a31226e" containerName="extract-utilities" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.141449 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="2339ae01-ee38-4d70-a94c-6fab6a31226e" containerName="extract-utilities" Mar 17 11:36:32 crc kubenswrapper[4742]: E0317 11:36:32.141492 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2339ae01-ee38-4d70-a94c-6fab6a31226e" containerName="registry-server" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.141499 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="2339ae01-ee38-4d70-a94c-6fab6a31226e" containerName="registry-server" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.141691 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="2339ae01-ee38-4d70-a94c-6fab6a31226e" containerName="registry-server" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.144395 4742 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.154628 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-z42hc"] Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.281121 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.281440 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.281498 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5prk\" (UniqueName: \"kubernetes.io/projected/d3035223-2765-4ce8-ac14-f53ffcca7a1b-kube-api-access-r5prk\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.281546 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-dns-svc\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.281575 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-config\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.281607 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.281629 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.382783 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-dns-svc\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.382836 4742 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-config\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.382868 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.382895 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.382946 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.382991 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.383041 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5prk\" (UniqueName: \"kubernetes.io/projected/d3035223-2765-4ce8-ac14-f53ffcca7a1b-kube-api-access-r5prk\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.384272 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.384317 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.384330 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.384418 4742 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-config\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.384560 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-dns-svc\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.384808 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d3035223-2765-4ce8-ac14-f53ffcca7a1b-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.411170 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5prk\" (UniqueName: \"kubernetes.io/projected/d3035223-2765-4ce8-ac14-f53ffcca7a1b-kube-api-access-r5prk\") pod \"dnsmasq-dns-55478c4467-z42hc\" (UID: \"d3035223-2765-4ce8-ac14-f53ffcca7a1b\") " pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.463897 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.539363 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.692039 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-ovsdbserver-sb\") pod \"ab1d5568-a84d-4397-b93c-6b997192fb30\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.692520 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-config\") pod \"ab1d5568-a84d-4397-b93c-6b997192fb30\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.692867 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qhdj\" (UniqueName: \"kubernetes.io/projected/ab1d5568-a84d-4397-b93c-6b997192fb30-kube-api-access-6qhdj\") pod \"ab1d5568-a84d-4397-b93c-6b997192fb30\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.692991 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-ovsdbserver-nb\") pod \"ab1d5568-a84d-4397-b93c-6b997192fb30\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.693017 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-dns-svc\") pod \"ab1d5568-a84d-4397-b93c-6b997192fb30\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " Mar 17 11:36:32 crc 
kubenswrapper[4742]: I0317 11:36:32.693038 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-dns-swift-storage-0\") pod \"ab1d5568-a84d-4397-b93c-6b997192fb30\" (UID: \"ab1d5568-a84d-4397-b93c-6b997192fb30\") " Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.701442 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab1d5568-a84d-4397-b93c-6b997192fb30-kube-api-access-6qhdj" (OuterVolumeSpecName: "kube-api-access-6qhdj") pod "ab1d5568-a84d-4397-b93c-6b997192fb30" (UID: "ab1d5568-a84d-4397-b93c-6b997192fb30"). InnerVolumeSpecName "kube-api-access-6qhdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.753756 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ab1d5568-a84d-4397-b93c-6b997192fb30" (UID: "ab1d5568-a84d-4397-b93c-6b997192fb30"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.759587 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-config" (OuterVolumeSpecName: "config") pod "ab1d5568-a84d-4397-b93c-6b997192fb30" (UID: "ab1d5568-a84d-4397-b93c-6b997192fb30"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.762428 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ab1d5568-a84d-4397-b93c-6b997192fb30" (UID: "ab1d5568-a84d-4397-b93c-6b997192fb30"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.769406 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ab1d5568-a84d-4397-b93c-6b997192fb30" (UID: "ab1d5568-a84d-4397-b93c-6b997192fb30"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.779986 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ab1d5568-a84d-4397-b93c-6b997192fb30" (UID: "ab1d5568-a84d-4397-b93c-6b997192fb30"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.796441 4742 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.796496 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.796509 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.796518 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qhdj\" (UniqueName: \"kubernetes.io/projected/ab1d5568-a84d-4397-b93c-6b997192fb30-kube-api-access-6qhdj\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.796529 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.796538 4742 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab1d5568-a84d-4397-b93c-6b997192fb30-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.885304 4742 generic.go:334] "Generic (PLEG): container finished" podID="ab1d5568-a84d-4397-b93c-6b997192fb30" containerID="9fa46cb0281903ea2740e889c78f20f753154a8df89b1ea8118a440585c6bd72" exitCode=0 Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.885348 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" event={"ID":"ab1d5568-a84d-4397-b93c-6b997192fb30","Type":"ContainerDied","Data":"9fa46cb0281903ea2740e889c78f20f753154a8df89b1ea8118a440585c6bd72"} Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.885383 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" event={"ID":"ab1d5568-a84d-4397-b93c-6b997192fb30","Type":"ContainerDied","Data":"0ce4b0bb2effdc02bb36ab1633384d667ff048fb9218b3beaccb3b4d1e5c5b9a"} Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.885403 4742 scope.go:117] "RemoveContainer" containerID="9fa46cb0281903ea2740e889c78f20f753154a8df89b1ea8118a440585c6bd72" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.885446 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-dt4p6" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.928312 4742 scope.go:117] "RemoveContainer" containerID="09caa62f03751fee6040912acc017bda1b9300662caec89f6cadc815b78a79b6" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.931880 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-dt4p6"] Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.946099 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-dt4p6"] Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.961383 4742 scope.go:117] "RemoveContainer" containerID="9fa46cb0281903ea2740e889c78f20f753154a8df89b1ea8118a440585c6bd72" Mar 17 11:36:32 crc kubenswrapper[4742]: E0317 11:36:32.961808 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fa46cb0281903ea2740e889c78f20f753154a8df89b1ea8118a440585c6bd72\": container with ID starting with 9fa46cb0281903ea2740e889c78f20f753154a8df89b1ea8118a440585c6bd72 not found: ID does not exist" containerID="9fa46cb0281903ea2740e889c78f20f753154a8df89b1ea8118a440585c6bd72" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.961849 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fa46cb0281903ea2740e889c78f20f753154a8df89b1ea8118a440585c6bd72"} err="failed to get container status \"9fa46cb0281903ea2740e889c78f20f753154a8df89b1ea8118a440585c6bd72\": rpc error: code = NotFound desc = could not find container \"9fa46cb0281903ea2740e889c78f20f753154a8df89b1ea8118a440585c6bd72\": container with ID starting with 9fa46cb0281903ea2740e889c78f20f753154a8df89b1ea8118a440585c6bd72 not found: ID does not exist" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.961879 4742 scope.go:117] "RemoveContainer" containerID="09caa62f03751fee6040912acc017bda1b9300662caec89f6cadc815b78a79b6" Mar 17 11:36:32 crc kubenswrapper[4742]: E0317 11:36:32.962242 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09caa62f03751fee6040912acc017bda1b9300662caec89f6cadc815b78a79b6\": container with ID starting with 09caa62f03751fee6040912acc017bda1b9300662caec89f6cadc815b78a79b6 not found: ID does not exist" containerID="09caa62f03751fee6040912acc017bda1b9300662caec89f6cadc815b78a79b6" Mar 17 11:36:32 crc kubenswrapper[4742]: I0317 11:36:32.962274 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09caa62f03751fee6040912acc017bda1b9300662caec89f6cadc815b78a79b6"} err="failed to get container status \"09caa62f03751fee6040912acc017bda1b9300662caec89f6cadc815b78a79b6\": rpc error: code = NotFound desc = could not find container \"09caa62f03751fee6040912acc017bda1b9300662caec89f6cadc815b78a79b6\": container with ID starting with 09caa62f03751fee6040912acc017bda1b9300662caec89f6cadc815b78a79b6 not found: ID does not exist" Mar 17 11:36:33 crc kubenswrapper[4742]: I0317 11:36:33.015576 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-z42hc"] Mar 17 11:36:33 crc kubenswrapper[4742]: W0317 11:36:33.021145 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3035223_2765_4ce8_ac14_f53ffcca7a1b.slice/crio-6e55bae59e986834d9539994f86a7f9b4c34a25f7bc0e6a1a07ad0a98b2b81c9 WatchSource:0}: Error finding 
container 6e55bae59e986834d9539994f86a7f9b4c34a25f7bc0e6a1a07ad0a98b2b81c9: Status 404 returned error can't find the container with id 6e55bae59e986834d9539994f86a7f9b4c34a25f7bc0e6a1a07ad0a98b2b81c9 Mar 17 11:36:33 crc kubenswrapper[4742]: I0317 11:36:33.901741 4742 generic.go:334] "Generic (PLEG): container finished" podID="d3035223-2765-4ce8-ac14-f53ffcca7a1b" containerID="4f3a868af14841559f76f8ad875592740cb07c9a35db318be6b81cf3318e6dc8" exitCode=0 Mar 17 11:36:33 crc kubenswrapper[4742]: I0317 11:36:33.901859 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-z42hc" event={"ID":"d3035223-2765-4ce8-ac14-f53ffcca7a1b","Type":"ContainerDied","Data":"4f3a868af14841559f76f8ad875592740cb07c9a35db318be6b81cf3318e6dc8"} Mar 17 11:36:33 crc kubenswrapper[4742]: I0317 11:36:33.902281 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-z42hc" event={"ID":"d3035223-2765-4ce8-ac14-f53ffcca7a1b","Type":"ContainerStarted","Data":"6e55bae59e986834d9539994f86a7f9b4c34a25f7bc0e6a1a07ad0a98b2b81c9"} Mar 17 11:36:34 crc kubenswrapper[4742]: I0317 11:36:34.694502 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab1d5568-a84d-4397-b93c-6b997192fb30" path="/var/lib/kubelet/pods/ab1d5568-a84d-4397-b93c-6b997192fb30/volumes" Mar 17 11:36:34 crc kubenswrapper[4742]: I0317 11:36:34.915704 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-z42hc" event={"ID":"d3035223-2765-4ce8-ac14-f53ffcca7a1b","Type":"ContainerStarted","Data":"9cd1e34fab0c2ff554f2a4d84844788a8153c96c833eff81ff518683e4051521"} Mar 17 11:36:34 crc kubenswrapper[4742]: I0317 11:36:34.915935 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:34 crc kubenswrapper[4742]: I0317 11:36:34.938176 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-z42hc" podStartSLOduration=2.93815449 podStartE2EDuration="2.93815449s" podCreationTimestamp="2026-03-17 11:36:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:36:34.933477079 +0000 UTC m=+1498.059604847" watchObservedRunningTime="2026-03-17 11:36:34.93815449 +0000 UTC m=+1498.064282258" Mar 17 11:36:35 crc kubenswrapper[4742]: I0317 11:36:35.954738 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rhqg8" Mar 17 11:36:36 crc kubenswrapper[4742]: I0317 11:36:36.015636 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rhqg8" Mar 17 11:36:36 crc kubenswrapper[4742]: I0317 11:36:36.200469 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rhqg8"] Mar 17 11:36:37 crc kubenswrapper[4742]: I0317 11:36:37.950440 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rhqg8" podUID="cac34fc1-9c6c-4ffb-a772-87e33f70a856" containerName="registry-server" containerID="cri-o://41a3eda8f97221db7de1900157040679f7b22e52083e55fea7fe5f48089b2649" gracePeriod=2 Mar 17 11:36:38 crc kubenswrapper[4742]: I0317 11:36:38.450650 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rhqg8" Mar 17 11:36:38 crc kubenswrapper[4742]: I0317 11:36:38.525335 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfb5q\" (UniqueName: \"kubernetes.io/projected/cac34fc1-9c6c-4ffb-a772-87e33f70a856-kube-api-access-lfb5q\") pod \"cac34fc1-9c6c-4ffb-a772-87e33f70a856\" (UID: \"cac34fc1-9c6c-4ffb-a772-87e33f70a856\") " Mar 17 11:36:38 crc kubenswrapper[4742]: I0317 11:36:38.525444 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac34fc1-9c6c-4ffb-a772-87e33f70a856-catalog-content\") pod \"cac34fc1-9c6c-4ffb-a772-87e33f70a856\" (UID: \"cac34fc1-9c6c-4ffb-a772-87e33f70a856\") " Mar 17 11:36:38 crc kubenswrapper[4742]: I0317 11:36:38.525499 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac34fc1-9c6c-4ffb-a772-87e33f70a856-utilities\") pod \"cac34fc1-9c6c-4ffb-a772-87e33f70a856\" (UID: \"cac34fc1-9c6c-4ffb-a772-87e33f70a856\") " Mar 17 11:36:38 crc kubenswrapper[4742]: I0317 11:36:38.526832 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cac34fc1-9c6c-4ffb-a772-87e33f70a856-utilities" (OuterVolumeSpecName: "utilities") pod "cac34fc1-9c6c-4ffb-a772-87e33f70a856" (UID: "cac34fc1-9c6c-4ffb-a772-87e33f70a856"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:36:38 crc kubenswrapper[4742]: I0317 11:36:38.532973 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cac34fc1-9c6c-4ffb-a772-87e33f70a856-kube-api-access-lfb5q" (OuterVolumeSpecName: "kube-api-access-lfb5q") pod "cac34fc1-9c6c-4ffb-a772-87e33f70a856" (UID: "cac34fc1-9c6c-4ffb-a772-87e33f70a856"). InnerVolumeSpecName "kube-api-access-lfb5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:36:38 crc kubenswrapper[4742]: I0317 11:36:38.628381 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac34fc1-9c6c-4ffb-a772-87e33f70a856-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:38 crc kubenswrapper[4742]: I0317 11:36:38.628674 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfb5q\" (UniqueName: \"kubernetes.io/projected/cac34fc1-9c6c-4ffb-a772-87e33f70a856-kube-api-access-lfb5q\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:38 crc kubenswrapper[4742]: I0317 11:36:38.671091 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cac34fc1-9c6c-4ffb-a772-87e33f70a856-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cac34fc1-9c6c-4ffb-a772-87e33f70a856" (UID: "cac34fc1-9c6c-4ffb-a772-87e33f70a856"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:36:38 crc kubenswrapper[4742]: I0317 11:36:38.736784 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac34fc1-9c6c-4ffb-a772-87e33f70a856-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:38 crc kubenswrapper[4742]: I0317 11:36:38.959363 4742 generic.go:334] "Generic (PLEG): container finished" podID="cac34fc1-9c6c-4ffb-a772-87e33f70a856" containerID="41a3eda8f97221db7de1900157040679f7b22e52083e55fea7fe5f48089b2649" exitCode=0 Mar 17 11:36:38 crc kubenswrapper[4742]: I0317 11:36:38.959476 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rhqg8" Mar 17 11:36:38 crc kubenswrapper[4742]: I0317 11:36:38.959488 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhqg8" event={"ID":"cac34fc1-9c6c-4ffb-a772-87e33f70a856","Type":"ContainerDied","Data":"41a3eda8f97221db7de1900157040679f7b22e52083e55fea7fe5f48089b2649"} Mar 17 11:36:38 crc kubenswrapper[4742]: I0317 11:36:38.959565 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhqg8" event={"ID":"cac34fc1-9c6c-4ffb-a772-87e33f70a856","Type":"ContainerDied","Data":"537df5638144c7ede68b3ed04b22d0720b1348daf9ea49492c03a29d479bf53d"} Mar 17 11:36:38 crc kubenswrapper[4742]: I0317 11:36:38.959589 4742 scope.go:117] "RemoveContainer" containerID="41a3eda8f97221db7de1900157040679f7b22e52083e55fea7fe5f48089b2649" Mar 17 11:36:38 crc kubenswrapper[4742]: I0317 11:36:38.985055 4742 scope.go:117] "RemoveContainer" containerID="d403db49dbbfc88e0bff73873ce3567162423cef33907fbf81ab163ad9f266d5" Mar 17 11:36:38 crc kubenswrapper[4742]: I0317 11:36:38.995445 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rhqg8"] Mar 17 11:36:39 crc kubenswrapper[4742]: I0317 11:36:39.007531 4742 scope.go:117] "RemoveContainer" containerID="5ebba01f69ca93513bf85116ccc2cc17fb509c39b51b3c9c9c44b72da6402b05" Mar 17 11:36:39 crc kubenswrapper[4742]: I0317 11:36:39.010362 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rhqg8"] Mar 17 11:36:39 crc kubenswrapper[4742]: I0317 11:36:39.054107 4742 scope.go:117] "RemoveContainer" containerID="41a3eda8f97221db7de1900157040679f7b22e52083e55fea7fe5f48089b2649" Mar 17 11:36:39 crc kubenswrapper[4742]: E0317 11:36:39.054604 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41a3eda8f97221db7de1900157040679f7b22e52083e55fea7fe5f48089b2649\": container with ID starting with 41a3eda8f97221db7de1900157040679f7b22e52083e55fea7fe5f48089b2649 not found: ID does not exist" containerID="41a3eda8f97221db7de1900157040679f7b22e52083e55fea7fe5f48089b2649" Mar 17 11:36:39 crc kubenswrapper[4742]: I0317 11:36:39.054637 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41a3eda8f97221db7de1900157040679f7b22e52083e55fea7fe5f48089b2649"} err="failed to get container status \"41a3eda8f97221db7de1900157040679f7b22e52083e55fea7fe5f48089b2649\": rpc error: code = NotFound desc = could not find container \"41a3eda8f97221db7de1900157040679f7b22e52083e55fea7fe5f48089b2649\": container with ID starting with 41a3eda8f97221db7de1900157040679f7b22e52083e55fea7fe5f48089b2649 not found: ID does not exist" Mar 17 11:36:39 crc 
kubenswrapper[4742]: I0317 11:36:39.054657 4742 scope.go:117] "RemoveContainer" containerID="d403db49dbbfc88e0bff73873ce3567162423cef33907fbf81ab163ad9f266d5" Mar 17 11:36:39 crc kubenswrapper[4742]: E0317 11:36:39.054885 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d403db49dbbfc88e0bff73873ce3567162423cef33907fbf81ab163ad9f266d5\": container with ID starting with d403db49dbbfc88e0bff73873ce3567162423cef33907fbf81ab163ad9f266d5 not found: ID does not exist" containerID="d403db49dbbfc88e0bff73873ce3567162423cef33907fbf81ab163ad9f266d5" Mar 17 11:36:39 crc kubenswrapper[4742]: I0317 11:36:39.054997 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d403db49dbbfc88e0bff73873ce3567162423cef33907fbf81ab163ad9f266d5"} err="failed to get container status \"d403db49dbbfc88e0bff73873ce3567162423cef33907fbf81ab163ad9f266d5\": rpc error: code = NotFound desc = could not find container \"d403db49dbbfc88e0bff73873ce3567162423cef33907fbf81ab163ad9f266d5\": container with ID starting with d403db49dbbfc88e0bff73873ce3567162423cef33907fbf81ab163ad9f266d5 not found: ID does not exist" Mar 17 11:36:39 crc kubenswrapper[4742]: I0317 11:36:39.055078 4742 scope.go:117] "RemoveContainer" containerID="5ebba01f69ca93513bf85116ccc2cc17fb509c39b51b3c9c9c44b72da6402b05" Mar 17 11:36:39 crc kubenswrapper[4742]: E0317 11:36:39.055348 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ebba01f69ca93513bf85116ccc2cc17fb509c39b51b3c9c9c44b72da6402b05\": container with ID starting with 5ebba01f69ca93513bf85116ccc2cc17fb509c39b51b3c9c9c44b72da6402b05 not found: ID does not exist" containerID="5ebba01f69ca93513bf85116ccc2cc17fb509c39b51b3c9c9c44b72da6402b05" Mar 17 11:36:39 crc kubenswrapper[4742]: I0317 11:36:39.055368 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ebba01f69ca93513bf85116ccc2cc17fb509c39b51b3c9c9c44b72da6402b05"} err="failed to get container status \"5ebba01f69ca93513bf85116ccc2cc17fb509c39b51b3c9c9c44b72da6402b05\": rpc error: code = NotFound desc = could not find container \"5ebba01f69ca93513bf85116ccc2cc17fb509c39b51b3c9c9c44b72da6402b05\": container with ID starting with 5ebba01f69ca93513bf85116ccc2cc17fb509c39b51b3c9c9c44b72da6402b05 not found: ID does not exist" Mar 17 11:36:40 crc kubenswrapper[4742]: I0317 11:36:40.676717 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cac34fc1-9c6c-4ffb-a772-87e33f70a856" path="/var/lib/kubelet/pods/cac34fc1-9c6c-4ffb-a772-87e33f70a856/volumes" Mar 17 11:36:41 crc kubenswrapper[4742]: E0317 11:36:41.301871 4742 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac34fc1_9c6c_4ffb_a772_87e33f70a856.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac34fc1_9c6c_4ffb_a772_87e33f70a856.slice/crio-537df5638144c7ede68b3ed04b22d0720b1348daf9ea49492c03a29d479bf53d\": RecentStats: unable to find data in memory cache]" Mar 17 11:36:42 crc kubenswrapper[4742]: I0317 11:36:42.465184 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-z42hc" Mar 17 11:36:42 crc kubenswrapper[4742]: I0317 11:36:42.564522 4742 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-phn6z"] Mar 17 11:36:42 crc kubenswrapper[4742]: I0317 11:36:42.565090 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" podUID="204902d3-4729-4f78-ba39-d5495676a514" containerName="dnsmasq-dns" containerID="cri-o://5c6bfb504b711fa3c0c16f626be67dc1479ce3838de3a9caaae00e79da969e41" gracePeriod=10 Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.009770 4742 generic.go:334] "Generic (PLEG): container finished" podID="204902d3-4729-4f78-ba39-d5495676a514" containerID="5c6bfb504b711fa3c0c16f626be67dc1479ce3838de3a9caaae00e79da969e41" exitCode=0 Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.009814 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" event={"ID":"204902d3-4729-4f78-ba39-d5495676a514","Type":"ContainerDied","Data":"5c6bfb504b711fa3c0c16f626be67dc1479ce3838de3a9caaae00e79da969e41"} Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.009857 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" event={"ID":"204902d3-4729-4f78-ba39-d5495676a514","Type":"ContainerDied","Data":"365711c795d727754be779f1c5c4aa0ac401ad7790ec2326701fb86ca47ba29d"} Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.009869 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="365711c795d727754be779f1c5c4aa0ac401ad7790ec2326701fb86ca47ba29d" Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.043800 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.122489 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-config\") pod \"204902d3-4729-4f78-ba39-d5495676a514\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.122590 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-dns-swift-storage-0\") pod \"204902d3-4729-4f78-ba39-d5495676a514\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.122675 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-ovsdbserver-sb\") pod \"204902d3-4729-4f78-ba39-d5495676a514\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.122707 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmr5n\" (UniqueName: \"kubernetes.io/projected/204902d3-4729-4f78-ba39-d5495676a514-kube-api-access-dmr5n\") pod \"204902d3-4729-4f78-ba39-d5495676a514\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.122780 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-dns-svc\") pod \"204902d3-4729-4f78-ba39-d5495676a514\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.122853 4742 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-ovsdbserver-nb\") pod \"204902d3-4729-4f78-ba39-d5495676a514\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.122880 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-openstack-edpm-ipam\") pod \"204902d3-4729-4f78-ba39-d5495676a514\" (UID: \"204902d3-4729-4f78-ba39-d5495676a514\") " Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.129315 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/204902d3-4729-4f78-ba39-d5495676a514-kube-api-access-dmr5n" (OuterVolumeSpecName: "kube-api-access-dmr5n") pod "204902d3-4729-4f78-ba39-d5495676a514" (UID: "204902d3-4729-4f78-ba39-d5495676a514"). InnerVolumeSpecName "kube-api-access-dmr5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.171007 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "204902d3-4729-4f78-ba39-d5495676a514" (UID: "204902d3-4729-4f78-ba39-d5495676a514"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.220550 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "204902d3-4729-4f78-ba39-d5495676a514" (UID: "204902d3-4729-4f78-ba39-d5495676a514"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.231448 4742 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.231477 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.231487 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmr5n\" (UniqueName: \"kubernetes.io/projected/204902d3-4729-4f78-ba39-d5495676a514-kube-api-access-dmr5n\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.240729 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-config" (OuterVolumeSpecName: "config") pod "204902d3-4729-4f78-ba39-d5495676a514" (UID: "204902d3-4729-4f78-ba39-d5495676a514"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.249375 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "204902d3-4729-4f78-ba39-d5495676a514" (UID: "204902d3-4729-4f78-ba39-d5495676a514"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.261380 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "204902d3-4729-4f78-ba39-d5495676a514" (UID: "204902d3-4729-4f78-ba39-d5495676a514"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.294555 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "204902d3-4729-4f78-ba39-d5495676a514" (UID: "204902d3-4729-4f78-ba39-d5495676a514"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.333055 4742 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.333090 4742 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.333099 4742 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:43 crc kubenswrapper[4742]: I0317 11:36:43.333108 4742 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204902d3-4729-4f78-ba39-d5495676a514-config\") on node \"crc\" DevicePath \"\"" Mar 17 11:36:44 crc kubenswrapper[4742]: I0317 11:36:44.018968 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-phn6z" Mar 17 11:36:44 crc kubenswrapper[4742]: I0317 11:36:44.057373 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-phn6z"] Mar 17 11:36:44 crc kubenswrapper[4742]: I0317 11:36:44.067044 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-phn6z"] Mar 17 11:36:44 crc kubenswrapper[4742]: I0317 11:36:44.681780 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="204902d3-4729-4f78-ba39-d5495676a514" path="/var/lib/kubelet/pods/204902d3-4729-4f78-ba39-d5495676a514/volumes" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.175292 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4"] Mar 17 11:36:51 crc kubenswrapper[4742]: E0317 11:36:51.176092 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab1d5568-a84d-4397-b93c-6b997192fb30" containerName="dnsmasq-dns" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.176103 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab1d5568-a84d-4397-b93c-6b997192fb30" containerName="dnsmasq-dns" Mar 17 11:36:51 crc kubenswrapper[4742]: E0317 11:36:51.176123 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab1d5568-a84d-4397-b93c-6b997192fb30" containerName="init" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.176129 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab1d5568-a84d-4397-b93c-6b997192fb30" containerName="init" Mar 17 11:36:51 crc kubenswrapper[4742]: E0317 11:36:51.176138 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac34fc1-9c6c-4ffb-a772-87e33f70a856" containerName="extract-content" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.176144 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac34fc1-9c6c-4ffb-a772-87e33f70a856" containerName="extract-content" Mar 17 11:36:51 crc kubenswrapper[4742]: E0317 11:36:51.176158 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac34fc1-9c6c-4ffb-a772-87e33f70a856" containerName="extract-utilities" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.176164 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac34fc1-9c6c-4ffb-a772-87e33f70a856" containerName="extract-utilities" Mar 17 11:36:51 crc kubenswrapper[4742]: E0317 11:36:51.176170 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac34fc1-9c6c-4ffb-a772-87e33f70a856" containerName="registry-server" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.176176 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac34fc1-9c6c-4ffb-a772-87e33f70a856" containerName="registry-server" Mar 17 11:36:51 crc kubenswrapper[4742]: E0317 11:36:51.176192 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204902d3-4729-4f78-ba39-d5495676a514" containerName="init" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.176197 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="204902d3-4729-4f78-ba39-d5495676a514" containerName="init" Mar 17 11:36:51 crc kubenswrapper[4742]: E0317 11:36:51.176212 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204902d3-4729-4f78-ba39-d5495676a514" containerName="dnsmasq-dns" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.176217 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="204902d3-4729-4f78-ba39-d5495676a514" 
containerName="dnsmasq-dns" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.176376 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab1d5568-a84d-4397-b93c-6b997192fb30" containerName="dnsmasq-dns" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.176388 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="cac34fc1-9c6c-4ffb-a772-87e33f70a856" containerName="registry-server" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.176401 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="204902d3-4729-4f78-ba39-d5495676a514" containerName="dnsmasq-dns" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.176985 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.179376 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.179391 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.179499 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8b7p" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.179718 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.188695 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4"] Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.291609 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4\" (UID: \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.291711 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zntx\" (UniqueName: \"kubernetes.io/projected/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-kube-api-access-6zntx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4\" (UID: \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.291778 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4\" (UID: \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.291841 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4\" (UID: \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.393394 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4\" (UID: \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.393457 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4\" (UID: \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.393558 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4\" (UID: \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.393591 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zntx\" (UniqueName: \"kubernetes.io/projected/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-kube-api-access-6zntx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4\" (UID: \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.400374 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4\" (UID: \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.400545 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4\" (UID: \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.400669 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4\" (UID: \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.416723 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zntx\" (UniqueName: \"kubernetes.io/projected/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-kube-api-access-6zntx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4\" (UID: 
\"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" Mar 17 11:36:51 crc kubenswrapper[4742]: I0317 11:36:51.494515 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" Mar 17 11:36:51 crc kubenswrapper[4742]: E0317 11:36:51.534729 4742 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac34fc1_9c6c_4ffb_a772_87e33f70a856.slice/crio-537df5638144c7ede68b3ed04b22d0720b1348daf9ea49492c03a29d479bf53d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac34fc1_9c6c_4ffb_a772_87e33f70a856.slice\": RecentStats: unable to find data in memory cache]" Mar 17 11:36:52 crc kubenswrapper[4742]: I0317 11:36:52.061604 4742 scope.go:117] "RemoveContainer" containerID="ece0f4a648dedc1d926b58223b995f25c2021607970bd4c3840ad15871418b48" Mar 17 11:36:52 crc kubenswrapper[4742]: I0317 11:36:52.095402 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4"] Mar 17 11:36:52 crc kubenswrapper[4742]: W0317 11:36:52.121639 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabeb089b_7a3b_4ab5_b412_f6d7b7fd0c7f.slice/crio-813dd91ae11f3b3b60f3dcbfbea8d2c2bf783693212daa84b4d88f68d55ab276 WatchSource:0}: Error finding container 813dd91ae11f3b3b60f3dcbfbea8d2c2bf783693212daa84b4d88f68d55ab276: Status 404 returned error can't find the container with id 813dd91ae11f3b3b60f3dcbfbea8d2c2bf783693212daa84b4d88f68d55ab276 Mar 17 11:36:53 crc kubenswrapper[4742]: I0317 11:36:53.107570 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" event={"ID":"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f","Type":"ContainerStarted","Data":"813dd91ae11f3b3b60f3dcbfbea8d2c2bf783693212daa84b4d88f68d55ab276"} Mar 17 11:36:57 crc kubenswrapper[4742]: I0317 11:36:57.159177 4742 generic.go:334] "Generic (PLEG): container finished" podID="4e8c9887-8315-444e-b3dd-9753e83f83fa" containerID="ecf1386a52ef8ac6d953a6b6c75367f0adf70e1490942177d2c4dcbd922f8051" exitCode=0 Mar 17 11:36:57 crc kubenswrapper[4742]: I0317 11:36:57.159374 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4e8c9887-8315-444e-b3dd-9753e83f83fa","Type":"ContainerDied","Data":"ecf1386a52ef8ac6d953a6b6c75367f0adf70e1490942177d2c4dcbd922f8051"} Mar 17 11:36:59 crc kubenswrapper[4742]: I0317 11:36:59.208503 4742 generic.go:334] "Generic (PLEG): container finished" podID="6c10e471-26c3-41ec-bf47-a5edf33c173d" containerID="61ad8c14db28965b0e7ec192ccbd1132aef249cfd1e5e01434a859b98cc2f305" exitCode=0 Mar 17 11:36:59 crc kubenswrapper[4742]: I0317 11:36:59.208617 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6c10e471-26c3-41ec-bf47-a5edf33c173d","Type":"ContainerDied","Data":"61ad8c14db28965b0e7ec192ccbd1132aef249cfd1e5e01434a859b98cc2f305"} Mar 17 11:37:01 crc kubenswrapper[4742]: E0317 11:37:01.825108 4742 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac34fc1_9c6c_4ffb_a772_87e33f70a856.slice/crio-537df5638144c7ede68b3ed04b22d0720b1348daf9ea49492c03a29d479bf53d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac34fc1_9c6c_4ffb_a772_87e33f70a856.slice\": RecentStats: unable to find data in memory cache]" Mar 17 11:37:02 crc kubenswrapper[4742]: I0317 11:37:02.244663 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6c10e471-26c3-41ec-bf47-a5edf33c173d","Type":"ContainerStarted","Data":"2dfe4b611f7cee02c941ffa4f651ba86f6988863c1a63b476e3e09cb8b2bd129"} Mar 17 11:37:02 crc kubenswrapper[4742]: I0317 11:37:02.246202 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:37:02 crc kubenswrapper[4742]: I0317 11:37:02.272251 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4e8c9887-8315-444e-b3dd-9753e83f83fa","Type":"ContainerStarted","Data":"6ccc8e4f9f4cd59ac4fc4e7efa23ca58f6ad29f66c78333918f3c5c8ceab5440"} Mar 17 11:37:02 crc kubenswrapper[4742]: I0317 11:37:02.273231 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 17 11:37:02 crc kubenswrapper[4742]: I0317 11:37:02.291671 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" event={"ID":"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f","Type":"ContainerStarted","Data":"4ba934ec349f500ca2be2847d00d795d988c8b1ed2e4d2e7204a967a5cf4e574"} Mar 17 11:37:02 crc kubenswrapper[4742]: I0317 11:37:02.294460 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.294446007 podStartE2EDuration="40.294446007s" podCreationTimestamp="2026-03-17 11:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:37:02.291442793 +0000 UTC m=+1525.417570581" watchObservedRunningTime="2026-03-17 11:37:02.294446007 +0000 UTC m=+1525.420573765" Mar 17 11:37:02 crc kubenswrapper[4742]: I0317 11:37:02.322860 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=41.322843318 podStartE2EDuration="41.322843318s" podCreationTimestamp="2026-03-17 11:36:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 11:37:02.321852301 +0000 UTC m=+1525.447980059" watchObservedRunningTime="2026-03-17 11:37:02.322843318 +0000 UTC m=+1525.448971076" Mar 17 11:37:02 crc kubenswrapper[4742]: I0317 11:37:02.368951 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" podStartSLOduration=1.793418988 podStartE2EDuration="11.368928423s" podCreationTimestamp="2026-03-17 11:36:51 +0000 UTC" firstStartedPulling="2026-03-17 11:36:52.133352368 +0000 UTC m=+1515.259480126" lastFinishedPulling="2026-03-17 11:37:01.708861803 +0000 UTC m=+1524.834989561" observedRunningTime="2026-03-17 11:37:02.364422047 +0000 UTC m=+1525.490549825" watchObservedRunningTime="2026-03-17 11:37:02.368928423 +0000 UTC m=+1525.495056191" Mar 17 11:37:12 crc kubenswrapper[4742]: E0317 11:37:12.058678 4742 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac34fc1_9c6c_4ffb_a772_87e33f70a856.slice/crio-537df5638144c7ede68b3ed04b22d0720b1348daf9ea49492c03a29d479bf53d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac34fc1_9c6c_4ffb_a772_87e33f70a856.slice\": RecentStats: unable to find data in memory cache]" Mar 17 11:37:12 crc kubenswrapper[4742]: I0317 11:37:12.135185 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 17 11:37:13 crc kubenswrapper[4742]: I0317 11:37:13.164206 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 17 11:37:13 crc kubenswrapper[4742]: I0317 11:37:13.415074 4742 generic.go:334] "Generic (PLEG): container finished" podID="abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f" containerID="4ba934ec349f500ca2be2847d00d795d988c8b1ed2e4d2e7204a967a5cf4e574" exitCode=0 Mar 17 11:37:13 crc kubenswrapper[4742]: I0317 11:37:13.415331 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" event={"ID":"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f","Type":"ContainerDied","Data":"4ba934ec349f500ca2be2847d00d795d988c8b1ed2e4d2e7204a967a5cf4e574"} Mar 17 11:37:14 crc kubenswrapper[4742]: I0317 11:37:14.847386 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" Mar 17 11:37:14 crc kubenswrapper[4742]: I0317 11:37:14.997721 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-inventory\") pod \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\" (UID: \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\") " Mar 17 11:37:14 crc kubenswrapper[4742]: I0317 11:37:14.997841 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-ssh-key-openstack-edpm-ipam\") pod \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\" (UID: \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\") " Mar 17 11:37:14 crc kubenswrapper[4742]: I0317 11:37:14.997889 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zntx\" (UniqueName: \"kubernetes.io/projected/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-kube-api-access-6zntx\") pod \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\" (UID: \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\") " Mar 17 11:37:14 crc kubenswrapper[4742]: I0317 11:37:14.997961 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-repo-setup-combined-ca-bundle\") pod \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\" (UID: \"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f\") " Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.003125 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f" (UID: "abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.003765 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-kube-api-access-6zntx" (OuterVolumeSpecName: "kube-api-access-6zntx") pod "abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f" (UID: "abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f"). InnerVolumeSpecName "kube-api-access-6zntx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.041675 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f" (UID: "abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.048519 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-inventory" (OuterVolumeSpecName: "inventory") pod "abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f" (UID: "abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.099878 4742 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.099937 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.099951 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zntx\" (UniqueName: \"kubernetes.io/projected/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-kube-api-access-6zntx\") on node \"crc\" DevicePath \"\"" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.099962 4742 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.436557 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" event={"ID":"abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f","Type":"ContainerDied","Data":"813dd91ae11f3b3b60f3dcbfbea8d2c2bf783693212daa84b4d88f68d55ab276"} Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.436598 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="813dd91ae11f3b3b60f3dcbfbea8d2c2bf783693212daa84b4d88f68d55ab276" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.436661 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.531657 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6"] Mar 17 11:37:15 crc kubenswrapper[4742]: E0317 11:37:15.532103 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.532125 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.532349 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.533166 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.535192 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.535598 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.539504 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.541893 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8b7p" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.544820 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6"] Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.608995 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmkm9\" (UniqueName: \"kubernetes.io/projected/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-kube-api-access-vmkm9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l8dw6\" (UID: \"529b4c5a-8be2-4820-b06a-11eb75c3dc3b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.609200 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l8dw6\" (UID: \"529b4c5a-8be2-4820-b06a-11eb75c3dc3b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.609331 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l8dw6\" (UID: \"529b4c5a-8be2-4820-b06a-11eb75c3dc3b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.711630 4742 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vmkm9\" (UniqueName: \"kubernetes.io/projected/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-kube-api-access-vmkm9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l8dw6\" (UID: \"529b4c5a-8be2-4820-b06a-11eb75c3dc3b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.711804 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l8dw6\" (UID: \"529b4c5a-8be2-4820-b06a-11eb75c3dc3b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.711893 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l8dw6\" (UID: \"529b4c5a-8be2-4820-b06a-11eb75c3dc3b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.716190 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l8dw6\" (UID: \"529b4c5a-8be2-4820-b06a-11eb75c3dc3b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.718163 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l8dw6\" (UID: \"529b4c5a-8be2-4820-b06a-11eb75c3dc3b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.727398 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmkm9\" (UniqueName: \"kubernetes.io/projected/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-kube-api-access-vmkm9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l8dw6\" (UID: \"529b4c5a-8be2-4820-b06a-11eb75c3dc3b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" Mar 17 11:37:15 crc kubenswrapper[4742]: I0317 11:37:15.865203 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" Mar 17 11:37:16 crc kubenswrapper[4742]: I0317 11:37:16.375388 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6"] Mar 17 11:37:16 crc kubenswrapper[4742]: I0317 11:37:16.445379 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" event={"ID":"529b4c5a-8be2-4820-b06a-11eb75c3dc3b","Type":"ContainerStarted","Data":"46b2b7a8657d09b6d4a02ed799f86f5d9e269054197622f3a6200f22d08f8c26"} Mar 17 11:37:18 crc kubenswrapper[4742]: I0317 11:37:18.044308 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:37:18 crc kubenswrapper[4742]: I0317 11:37:18.044801 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:37:18 crc kubenswrapper[4742]: I0317 11:37:18.475454 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" event={"ID":"529b4c5a-8be2-4820-b06a-11eb75c3dc3b","Type":"ContainerStarted","Data":"09df9405779289c316e7be307249a294dbec767afddce5fc90fcaf1e9a682499"} Mar 17 11:37:18 crc kubenswrapper[4742]: I0317 11:37:18.501993 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" podStartSLOduration=2.600600917 podStartE2EDuration="3.501968454s" podCreationTimestamp="2026-03-17 11:37:15 +0000 UTC" firstStartedPulling="2026-03-17 11:37:16.368714328 +0000 UTC m=+1539.494842086" lastFinishedPulling="2026-03-17 11:37:17.270081825 +0000 UTC m=+1540.396209623" observedRunningTime="2026-03-17 11:37:18.497480569 +0000 UTC m=+1541.623608337" watchObservedRunningTime="2026-03-17 11:37:18.501968454 +0000 UTC m=+1541.628096212" Mar 17 11:37:20 crc kubenswrapper[4742]: I0317 11:37:20.502018 4742 generic.go:334] "Generic (PLEG): container finished" podID="529b4c5a-8be2-4820-b06a-11eb75c3dc3b" containerID="09df9405779289c316e7be307249a294dbec767afddce5fc90fcaf1e9a682499" exitCode=0 Mar 17 11:37:20 crc kubenswrapper[4742]: I0317 11:37:20.502150 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" event={"ID":"529b4c5a-8be2-4820-b06a-11eb75c3dc3b","Type":"ContainerDied","Data":"09df9405779289c316e7be307249a294dbec767afddce5fc90fcaf1e9a682499"} Mar 17 11:37:21 crc kubenswrapper[4742]: I0317 11:37:21.952612 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.043425 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-inventory\") pod \"529b4c5a-8be2-4820-b06a-11eb75c3dc3b\" (UID: \"529b4c5a-8be2-4820-b06a-11eb75c3dc3b\") " Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.043547 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmkm9\" (UniqueName: \"kubernetes.io/projected/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-kube-api-access-vmkm9\") pod \"529b4c5a-8be2-4820-b06a-11eb75c3dc3b\" (UID: \"529b4c5a-8be2-4820-b06a-11eb75c3dc3b\") " Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.043580 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-ssh-key-openstack-edpm-ipam\") pod \"529b4c5a-8be2-4820-b06a-11eb75c3dc3b\" (UID: \"529b4c5a-8be2-4820-b06a-11eb75c3dc3b\") " Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.051441 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-kube-api-access-vmkm9" (OuterVolumeSpecName: "kube-api-access-vmkm9") pod "529b4c5a-8be2-4820-b06a-11eb75c3dc3b" (UID: "529b4c5a-8be2-4820-b06a-11eb75c3dc3b"). InnerVolumeSpecName "kube-api-access-vmkm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.072045 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-inventory" (OuterVolumeSpecName: "inventory") pod "529b4c5a-8be2-4820-b06a-11eb75c3dc3b" (UID: "529b4c5a-8be2-4820-b06a-11eb75c3dc3b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.089570 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "529b4c5a-8be2-4820-b06a-11eb75c3dc3b" (UID: "529b4c5a-8be2-4820-b06a-11eb75c3dc3b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.146174 4742 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.146218 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmkm9\" (UniqueName: \"kubernetes.io/projected/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-kube-api-access-vmkm9\") on node \"crc\" DevicePath \"\"" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.146233 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/529b4c5a-8be2-4820-b06a-11eb75c3dc3b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 11:37:22 crc kubenswrapper[4742]: E0317 11:37:22.322775 4742 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac34fc1_9c6c_4ffb_a772_87e33f70a856.slice/crio-537df5638144c7ede68b3ed04b22d0720b1348daf9ea49492c03a29d479bf53d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac34fc1_9c6c_4ffb_a772_87e33f70a856.slice\": RecentStats: unable to find data in memory cache]" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.536561 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" event={"ID":"529b4c5a-8be2-4820-b06a-11eb75c3dc3b","Type":"ContainerDied","Data":"46b2b7a8657d09b6d4a02ed799f86f5d9e269054197622f3a6200f22d08f8c26"} Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.536606 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46b2b7a8657d09b6d4a02ed799f86f5d9e269054197622f3a6200f22d08f8c26" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.536674 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l8dw6" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.624664 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc"] Mar 17 11:37:22 crc kubenswrapper[4742]: E0317 11:37:22.625717 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="529b4c5a-8be2-4820-b06a-11eb75c3dc3b" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.625878 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="529b4c5a-8be2-4820-b06a-11eb75c3dc3b" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.626260 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="529b4c5a-8be2-4820-b06a-11eb75c3dc3b" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.627188 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.630202 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.630700 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.631070 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8b7p" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.635583 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.649085 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc"] Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.757371 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc\" (UID: \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.757556 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cpjh\" (UniqueName: \"kubernetes.io/projected/e6bf81f0-73d3-4dde-937d-87bbea94c36e-kube-api-access-5cpjh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc\" (UID: \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.757602 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc\" (UID: \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.757670 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc\" (UID: \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.860215 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cpjh\" (UniqueName: \"kubernetes.io/projected/e6bf81f0-73d3-4dde-937d-87bbea94c36e-kube-api-access-5cpjh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc\" (UID: \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.860313 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc\" (UID: \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.860397 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc\" (UID: \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.860477 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc\" (UID: \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.868895 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc\" (UID: \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.868900 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc\" (UID: \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.869279 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc\" (UID: \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.884783 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cpjh\" (UniqueName: \"kubernetes.io/projected/e6bf81f0-73d3-4dde-937d-87bbea94c36e-kube-api-access-5cpjh\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc\" (UID: \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" Mar 17 11:37:22 crc kubenswrapper[4742]: I0317 11:37:22.948965 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" Mar 17 11:37:23 crc kubenswrapper[4742]: I0317 11:37:23.559155 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc"] Mar 17 11:37:24 crc kubenswrapper[4742]: I0317 11:37:24.577273 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" event={"ID":"e6bf81f0-73d3-4dde-937d-87bbea94c36e","Type":"ContainerStarted","Data":"3e919279a68a16517086c2944a547a01a33d918ac6faff393c030aa60ace2ba5"} Mar 17 11:37:24 crc kubenswrapper[4742]: I0317 11:37:24.577750 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" event={"ID":"e6bf81f0-73d3-4dde-937d-87bbea94c36e","Type":"ContainerStarted","Data":"14c7e39d8287c10268c2cc432d93fa47bbb77f98a36dd12d9d9c8387761f113f"} Mar 17 11:37:24 crc kubenswrapper[4742]: I0317 11:37:24.616733 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" podStartSLOduration=2.091937178 podStartE2EDuration="2.616712567s" podCreationTimestamp="2026-03-17 11:37:22 +0000 UTC" firstStartedPulling="2026-03-17 11:37:23.577089476 +0000 UTC m=+1546.703217254" lastFinishedPulling="2026-03-17 11:37:24.101864885 +0000 UTC m=+1547.227992643" observedRunningTime="2026-03-17 11:37:24.603661403 +0000 UTC m=+1547.729789171" watchObservedRunningTime="2026-03-17 11:37:24.616712567 +0000 UTC m=+1547.742840335" Mar 17 11:37:32 crc kubenswrapper[4742]: E0317 11:37:32.572398 4742 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac34fc1_9c6c_4ffb_a772_87e33f70a856.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac34fc1_9c6c_4ffb_a772_87e33f70a856.slice/crio-537df5638144c7ede68b3ed04b22d0720b1348daf9ea49492c03a29d479bf53d\": RecentStats: unable to find data in memory cache]" Mar 17 11:37:48 crc kubenswrapper[4742]: I0317 11:37:48.043660 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:37:48 crc kubenswrapper[4742]: I0317 11:37:48.044231 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:37:52 crc kubenswrapper[4742]: I0317 11:37:52.222007 4742 scope.go:117] "RemoveContainer" containerID="b2e577c301d5b5bcddc3c97fd98c17d9bed76ad4ca32102a18078d5fafa190e6" Mar 17 11:37:52 crc kubenswrapper[4742]: I0317 11:37:52.249471 4742 scope.go:117] "RemoveContainer" containerID="19c851e05f0711c436c16a16a9aae644a7b365716b32c838639893b99dc976a6" Mar 17 11:37:52 crc kubenswrapper[4742]: I0317 11:37:52.359974 4742 scope.go:117] "RemoveContainer" containerID="132e6829f0471b35d024d8b51add4272475ab6913eef83c75aa27597272f3deb" Mar 17 11:38:00 crc kubenswrapper[4742]: I0317 11:38:00.177072 4742 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562458-dphx7"] Mar 17 11:38:00 crc kubenswrapper[4742]: I0317 11:38:00.179390 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562458-dphx7" Mar 17 11:38:00 crc kubenswrapper[4742]: I0317 11:38:00.183190 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:38:00 crc kubenswrapper[4742]: I0317 11:38:00.183233 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:38:00 crc kubenswrapper[4742]: I0317 11:38:00.183444 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:38:00 crc kubenswrapper[4742]: I0317 11:38:00.185550 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562458-dphx7"] Mar 17 11:38:00 crc kubenswrapper[4742]: I0317 11:38:00.332811 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b99s2\" (UniqueName: \"kubernetes.io/projected/3a9fe8c0-2ce7-4fae-b112-a9778e1cee38-kube-api-access-b99s2\") pod \"auto-csr-approver-29562458-dphx7\" (UID: \"3a9fe8c0-2ce7-4fae-b112-a9778e1cee38\") " pod="openshift-infra/auto-csr-approver-29562458-dphx7" Mar 17 11:38:00 crc kubenswrapper[4742]: I0317 11:38:00.434717 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b99s2\" (UniqueName: \"kubernetes.io/projected/3a9fe8c0-2ce7-4fae-b112-a9778e1cee38-kube-api-access-b99s2\") pod \"auto-csr-approver-29562458-dphx7\" (UID: \"3a9fe8c0-2ce7-4fae-b112-a9778e1cee38\") " pod="openshift-infra/auto-csr-approver-29562458-dphx7" Mar 17 11:38:00 crc kubenswrapper[4742]: I0317 11:38:00.462517 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b99s2\" (UniqueName: \"kubernetes.io/projected/3a9fe8c0-2ce7-4fae-b112-a9778e1cee38-kube-api-access-b99s2\") pod \"auto-csr-approver-29562458-dphx7\" (UID: \"3a9fe8c0-2ce7-4fae-b112-a9778e1cee38\") " pod="openshift-infra/auto-csr-approver-29562458-dphx7" Mar 17 11:38:00 crc kubenswrapper[4742]: I0317 11:38:00.515869 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562458-dphx7" Mar 17 11:38:01 crc kubenswrapper[4742]: W0317 11:38:01.048227 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a9fe8c0_2ce7_4fae_b112_a9778e1cee38.slice/crio-f2d57f0b092ca2e4eeca171467c8c08aca23aa99c12dc707d8f95980884943e1 WatchSource:0}: Error finding container f2d57f0b092ca2e4eeca171467c8c08aca23aa99c12dc707d8f95980884943e1: Status 404 returned error can't find the container with id f2d57f0b092ca2e4eeca171467c8c08aca23aa99c12dc707d8f95980884943e1 Mar 17 11:38:01 crc kubenswrapper[4742]: I0317 11:38:01.049281 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562458-dphx7"] Mar 17 11:38:01 crc kubenswrapper[4742]: I0317 11:38:01.051735 4742 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 11:38:01 crc kubenswrapper[4742]: I0317 11:38:01.240822 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562458-dphx7" event={"ID":"3a9fe8c0-2ce7-4fae-b112-a9778e1cee38","Type":"ContainerStarted","Data":"f2d57f0b092ca2e4eeca171467c8c08aca23aa99c12dc707d8f95980884943e1"} Mar 17 11:38:03 crc kubenswrapper[4742]: I0317 11:38:03.272290 4742 generic.go:334] "Generic (PLEG): container finished" podID="3a9fe8c0-2ce7-4fae-b112-a9778e1cee38" containerID="d91483f1c643cbf53ffaab71a766e519070f4b7ff3910bee60a43a9b148efa77" exitCode=0 Mar 17 11:38:03 crc kubenswrapper[4742]: I0317 11:38:03.272451 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562458-dphx7" event={"ID":"3a9fe8c0-2ce7-4fae-b112-a9778e1cee38","Type":"ContainerDied","Data":"d91483f1c643cbf53ffaab71a766e519070f4b7ff3910bee60a43a9b148efa77"} Mar 17 11:38:04 crc kubenswrapper[4742]: I0317 11:38:04.657593 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562458-dphx7" Mar 17 11:38:04 crc kubenswrapper[4742]: I0317 11:38:04.827093 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b99s2\" (UniqueName: \"kubernetes.io/projected/3a9fe8c0-2ce7-4fae-b112-a9778e1cee38-kube-api-access-b99s2\") pod \"3a9fe8c0-2ce7-4fae-b112-a9778e1cee38\" (UID: \"3a9fe8c0-2ce7-4fae-b112-a9778e1cee38\") " Mar 17 11:38:04 crc kubenswrapper[4742]: I0317 11:38:04.838299 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a9fe8c0-2ce7-4fae-b112-a9778e1cee38-kube-api-access-b99s2" (OuterVolumeSpecName: "kube-api-access-b99s2") pod "3a9fe8c0-2ce7-4fae-b112-a9778e1cee38" (UID: "3a9fe8c0-2ce7-4fae-b112-a9778e1cee38"). InnerVolumeSpecName "kube-api-access-b99s2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:38:04 crc kubenswrapper[4742]: I0317 11:38:04.929325 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b99s2\" (UniqueName: \"kubernetes.io/projected/3a9fe8c0-2ce7-4fae-b112-a9778e1cee38-kube-api-access-b99s2\") on node \"crc\" DevicePath \"\"" Mar 17 11:38:05 crc kubenswrapper[4742]: I0317 11:38:05.297701 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562458-dphx7" event={"ID":"3a9fe8c0-2ce7-4fae-b112-a9778e1cee38","Type":"ContainerDied","Data":"f2d57f0b092ca2e4eeca171467c8c08aca23aa99c12dc707d8f95980884943e1"} Mar 17 11:38:05 crc kubenswrapper[4742]: I0317 11:38:05.297994 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2d57f0b092ca2e4eeca171467c8c08aca23aa99c12dc707d8f95980884943e1" Mar 17 11:38:05 crc kubenswrapper[4742]: I0317 11:38:05.297797 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562458-dphx7" Mar 17 11:38:05 crc kubenswrapper[4742]: I0317 11:38:05.743481 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562452-vfzcm"] Mar 17 11:38:05 crc kubenswrapper[4742]: I0317 11:38:05.751286 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562452-vfzcm"] Mar 17 11:38:06 crc kubenswrapper[4742]: I0317 11:38:06.685733 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86f1bf7f-425d-46dc-945b-64afbc107101" path="/var/lib/kubelet/pods/86f1bf7f-425d-46dc-945b-64afbc107101/volumes" Mar 17 11:38:18 crc kubenswrapper[4742]: I0317 11:38:18.043865 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:38:18 crc kubenswrapper[4742]: I0317 11:38:18.044602 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:38:18 crc kubenswrapper[4742]: I0317 11:38:18.044658 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:38:18 crc kubenswrapper[4742]: I0317 11:38:18.045626 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d"} pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 11:38:18 crc kubenswrapper[4742]: I0317 11:38:18.045687 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" containerID="cri-o://0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" gracePeriod=600 Mar 17 11:38:18 crc kubenswrapper[4742]: E0317 11:38:18.202661 4742 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:38:18 crc kubenswrapper[4742]: I0317 11:38:18.425589 4742 generic.go:334] "Generic (PLEG): container finished" podID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" exitCode=0 Mar 17 11:38:18 crc kubenswrapper[4742]: I0317 11:38:18.425646 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerDied","Data":"0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d"} Mar 17 11:38:18 crc kubenswrapper[4742]: I0317 11:38:18.425682 4742 scope.go:117] "RemoveContainer" containerID="1aeee9892509f65c6f012471968b84d5122ab43ea074794d2d7aecfdfae8d433" Mar 17 11:38:18 crc kubenswrapper[4742]: I0317 11:38:18.426346 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:38:18 crc kubenswrapper[4742]: E0317 11:38:18.426759 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:38:27 crc kubenswrapper[4742]: I0317 11:38:27.242753 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-77hd8"] Mar 17 11:38:27 crc kubenswrapper[4742]: E0317 11:38:27.243878 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9fe8c0-2ce7-4fae-b112-a9778e1cee38" containerName="oc" Mar 17 11:38:27 crc kubenswrapper[4742]: I0317 11:38:27.243894 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9fe8c0-2ce7-4fae-b112-a9778e1cee38" containerName="oc" Mar 17 11:38:27 crc kubenswrapper[4742]: I0317 11:38:27.244155 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a9fe8c0-2ce7-4fae-b112-a9778e1cee38" containerName="oc" Mar 17 11:38:27 crc kubenswrapper[4742]: I0317 11:38:27.245808 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-77hd8" Mar 17 11:38:27 crc kubenswrapper[4742]: I0317 11:38:27.263415 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77hd8"] Mar 17 11:38:27 crc kubenswrapper[4742]: I0317 11:38:27.316432 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-utilities\") pod \"certified-operators-77hd8\" (UID: \"63dac183-f47c-4ce9-a7f7-75f14f7b52a2\") " pod="openshift-marketplace/certified-operators-77hd8" Mar 17 11:38:27 crc kubenswrapper[4742]: I0317 11:38:27.316789 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czj9g\" (UniqueName: \"kubernetes.io/projected/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-kube-api-access-czj9g\") pod \"certified-operators-77hd8\" (UID: \"63dac183-f47c-4ce9-a7f7-75f14f7b52a2\") " pod="openshift-marketplace/certified-operators-77hd8" Mar 17 11:38:27 crc kubenswrapper[4742]: I0317 11:38:27.316848 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-catalog-content\") pod \"certified-operators-77hd8\" (UID: \"63dac183-f47c-4ce9-a7f7-75f14f7b52a2\") " pod="openshift-marketplace/certified-operators-77hd8" Mar 17 11:38:27 crc kubenswrapper[4742]: I0317 11:38:27.418313 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-utilities\") pod \"certified-operators-77hd8\" (UID: \"63dac183-f47c-4ce9-a7f7-75f14f7b52a2\") " pod="openshift-marketplace/certified-operators-77hd8" Mar 17 11:38:27 crc kubenswrapper[4742]: I0317 11:38:27.418406 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czj9g\" (UniqueName: \"kubernetes.io/projected/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-kube-api-access-czj9g\") pod \"certified-operators-77hd8\" (UID: \"63dac183-f47c-4ce9-a7f7-75f14f7b52a2\") " pod="openshift-marketplace/certified-operators-77hd8" Mar 17 11:38:27 crc kubenswrapper[4742]: I0317 11:38:27.418478 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-catalog-content\") pod \"certified-operators-77hd8\" (UID: \"63dac183-f47c-4ce9-a7f7-75f14f7b52a2\") " pod="openshift-marketplace/certified-operators-77hd8" Mar 17 11:38:27 crc kubenswrapper[4742]: I0317 11:38:27.419100 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-utilities\") pod \"certified-operators-77hd8\" (UID: \"63dac183-f47c-4ce9-a7f7-75f14f7b52a2\") " pod="openshift-marketplace/certified-operators-77hd8" Mar 17 11:38:27 crc kubenswrapper[4742]: I0317 11:38:27.419360 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-catalog-content\") pod \"certified-operators-77hd8\" (UID: \"63dac183-f47c-4ce9-a7f7-75f14f7b52a2\") " pod="openshift-marketplace/certified-operators-77hd8" Mar 17 11:38:27 crc kubenswrapper[4742]: I0317 11:38:27.444621 4742 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-czj9g\" (UniqueName: \"kubernetes.io/projected/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-kube-api-access-czj9g\") pod \"certified-operators-77hd8\" (UID: \"63dac183-f47c-4ce9-a7f7-75f14f7b52a2\") " pod="openshift-marketplace/certified-operators-77hd8" Mar 17 11:38:27 crc kubenswrapper[4742]: I0317 11:38:27.582364 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-77hd8" Mar 17 11:38:28 crc kubenswrapper[4742]: I0317 11:38:28.473897 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77hd8"] Mar 17 11:38:28 crc kubenswrapper[4742]: I0317 11:38:28.566106 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77hd8" event={"ID":"63dac183-f47c-4ce9-a7f7-75f14f7b52a2","Type":"ContainerStarted","Data":"a1b6b803dad5b58f31438d9c9fb7105c0524736622dc997a4e330ee097e47d26"} Mar 17 11:38:29 crc kubenswrapper[4742]: I0317 11:38:29.576610 4742 generic.go:334] "Generic (PLEG): container finished" podID="63dac183-f47c-4ce9-a7f7-75f14f7b52a2" containerID="496ffaed286d4bbdd8aea438cfa5c99288b961937755983f9967f79cc3263f39" exitCode=0 Mar 17 11:38:29 crc kubenswrapper[4742]: I0317 11:38:29.576737 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77hd8" event={"ID":"63dac183-f47c-4ce9-a7f7-75f14f7b52a2","Type":"ContainerDied","Data":"496ffaed286d4bbdd8aea438cfa5c99288b961937755983f9967f79cc3263f39"} Mar 17 11:38:29 crc kubenswrapper[4742]: I0317 11:38:29.662866 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:38:29 crc kubenswrapper[4742]: E0317 11:38:29.663281 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:38:31 crc kubenswrapper[4742]: I0317 11:38:31.605176 4742 generic.go:334] "Generic (PLEG): container finished" podID="63dac183-f47c-4ce9-a7f7-75f14f7b52a2" containerID="a9b0143fe277572717aa3948febe4c6dbc5b448767b4e5a2bdea5b9209b854ed" exitCode=0 Mar 17 11:38:31 crc kubenswrapper[4742]: I0317 11:38:31.605254 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77hd8" event={"ID":"63dac183-f47c-4ce9-a7f7-75f14f7b52a2","Type":"ContainerDied","Data":"a9b0143fe277572717aa3948febe4c6dbc5b448767b4e5a2bdea5b9209b854ed"} Mar 17 11:38:32 crc kubenswrapper[4742]: I0317 11:38:32.616423 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77hd8" event={"ID":"63dac183-f47c-4ce9-a7f7-75f14f7b52a2","Type":"ContainerStarted","Data":"f91d6b711bc3c196b44585eedd81e6ae3d62740d942f1348b00f42ffbce66307"} Mar 17 11:38:32 crc kubenswrapper[4742]: I0317 11:38:32.639753 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-77hd8" podStartSLOduration=3.1229135279999998 podStartE2EDuration="5.639736307s" podCreationTimestamp="2026-03-17 11:38:27 +0000 UTC" firstStartedPulling="2026-03-17 11:38:29.58022489 +0000 UTC m=+1612.706352658" 
lastFinishedPulling="2026-03-17 11:38:32.097047649 +0000 UTC m=+1615.223175437" observedRunningTime="2026-03-17 11:38:32.634956693 +0000 UTC m=+1615.761084461" watchObservedRunningTime="2026-03-17 11:38:32.639736307 +0000 UTC m=+1615.765864065" Mar 17 11:38:37 crc kubenswrapper[4742]: I0317 11:38:37.583194 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-77hd8" Mar 17 11:38:37 crc kubenswrapper[4742]: I0317 11:38:37.583784 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-77hd8" Mar 17 11:38:37 crc kubenswrapper[4742]: I0317 11:38:37.696094 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-77hd8" Mar 17 11:38:37 crc kubenswrapper[4742]: I0317 11:38:37.773396 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-77hd8" Mar 17 11:38:37 crc kubenswrapper[4742]: I0317 11:38:37.939074 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-77hd8"] Mar 17 11:38:39 crc kubenswrapper[4742]: I0317 11:38:39.714346 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-77hd8" podUID="63dac183-f47c-4ce9-a7f7-75f14f7b52a2" containerName="registry-server" containerID="cri-o://f91d6b711bc3c196b44585eedd81e6ae3d62740d942f1348b00f42ffbce66307" gracePeriod=2 Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.266606 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-77hd8" Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.284417 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czj9g\" (UniqueName: \"kubernetes.io/projected/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-kube-api-access-czj9g\") pod \"63dac183-f47c-4ce9-a7f7-75f14f7b52a2\" (UID: \"63dac183-f47c-4ce9-a7f7-75f14f7b52a2\") " Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.284496 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-utilities\") pod \"63dac183-f47c-4ce9-a7f7-75f14f7b52a2\" (UID: \"63dac183-f47c-4ce9-a7f7-75f14f7b52a2\") " Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.284535 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-catalog-content\") pod \"63dac183-f47c-4ce9-a7f7-75f14f7b52a2\" (UID: \"63dac183-f47c-4ce9-a7f7-75f14f7b52a2\") " Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.287411 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-utilities" (OuterVolumeSpecName: "utilities") pod "63dac183-f47c-4ce9-a7f7-75f14f7b52a2" (UID: "63dac183-f47c-4ce9-a7f7-75f14f7b52a2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.296871 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-kube-api-access-czj9g" (OuterVolumeSpecName: "kube-api-access-czj9g") pod "63dac183-f47c-4ce9-a7f7-75f14f7b52a2" (UID: "63dac183-f47c-4ce9-a7f7-75f14f7b52a2"). InnerVolumeSpecName "kube-api-access-czj9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.355688 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63dac183-f47c-4ce9-a7f7-75f14f7b52a2" (UID: "63dac183-f47c-4ce9-a7f7-75f14f7b52a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.386499 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czj9g\" (UniqueName: \"kubernetes.io/projected/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-kube-api-access-czj9g\") on node \"crc\" DevicePath \"\"" Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.386536 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.386551 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63dac183-f47c-4ce9-a7f7-75f14f7b52a2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.726791 4742 generic.go:334] "Generic (PLEG): container finished" podID="63dac183-f47c-4ce9-a7f7-75f14f7b52a2" containerID="f91d6b711bc3c196b44585eedd81e6ae3d62740d942f1348b00f42ffbce66307" exitCode=0 Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.726854 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77hd8" event={"ID":"63dac183-f47c-4ce9-a7f7-75f14f7b52a2","Type":"ContainerDied","Data":"f91d6b711bc3c196b44585eedd81e6ae3d62740d942f1348b00f42ffbce66307"} Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.726886 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77hd8" event={"ID":"63dac183-f47c-4ce9-a7f7-75f14f7b52a2","Type":"ContainerDied","Data":"a1b6b803dad5b58f31438d9c9fb7105c0524736622dc997a4e330ee097e47d26"} Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.726945 4742 scope.go:117] "RemoveContainer" containerID="f91d6b711bc3c196b44585eedd81e6ae3d62740d942f1348b00f42ffbce66307" Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.726942 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-77hd8" Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.754646 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-77hd8"] Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.765652 4742 scope.go:117] "RemoveContainer" containerID="a9b0143fe277572717aa3948febe4c6dbc5b448767b4e5a2bdea5b9209b854ed" Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.768087 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-77hd8"] Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.838561 4742 scope.go:117] "RemoveContainer" containerID="496ffaed286d4bbdd8aea438cfa5c99288b961937755983f9967f79cc3263f39" Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.871503 4742 scope.go:117] "RemoveContainer" containerID="f91d6b711bc3c196b44585eedd81e6ae3d62740d942f1348b00f42ffbce66307" Mar 17 11:38:40 crc kubenswrapper[4742]: E0317 11:38:40.871983 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f91d6b711bc3c196b44585eedd81e6ae3d62740d942f1348b00f42ffbce66307\": container with ID starting with f91d6b711bc3c196b44585eedd81e6ae3d62740d942f1348b00f42ffbce66307 not found: ID does not exist" containerID="f91d6b711bc3c196b44585eedd81e6ae3d62740d942f1348b00f42ffbce66307" Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.872031 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f91d6b711bc3c196b44585eedd81e6ae3d62740d942f1348b00f42ffbce66307"} err="failed to get container status \"f91d6b711bc3c196b44585eedd81e6ae3d62740d942f1348b00f42ffbce66307\": rpc error: code = NotFound desc = could not find container \"f91d6b711bc3c196b44585eedd81e6ae3d62740d942f1348b00f42ffbce66307\": container with ID starting with f91d6b711bc3c196b44585eedd81e6ae3d62740d942f1348b00f42ffbce66307 not found: ID does not exist" Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.872054 4742 scope.go:117] "RemoveContainer" containerID="a9b0143fe277572717aa3948febe4c6dbc5b448767b4e5a2bdea5b9209b854ed" Mar 17 11:38:40 crc kubenswrapper[4742]: E0317 11:38:40.872549 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9b0143fe277572717aa3948febe4c6dbc5b448767b4e5a2bdea5b9209b854ed\": container with ID starting with a9b0143fe277572717aa3948febe4c6dbc5b448767b4e5a2bdea5b9209b854ed not found: ID does not exist" containerID="a9b0143fe277572717aa3948febe4c6dbc5b448767b4e5a2bdea5b9209b854ed" Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.872627 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9b0143fe277572717aa3948febe4c6dbc5b448767b4e5a2bdea5b9209b854ed"} err="failed to get container status \"a9b0143fe277572717aa3948febe4c6dbc5b448767b4e5a2bdea5b9209b854ed\": rpc error: code = NotFound desc = could not find container \"a9b0143fe277572717aa3948febe4c6dbc5b448767b4e5a2bdea5b9209b854ed\": container with ID starting with a9b0143fe277572717aa3948febe4c6dbc5b448767b4e5a2bdea5b9209b854ed not found: ID does not exist" Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.872680 4742 scope.go:117] "RemoveContainer" containerID="496ffaed286d4bbdd8aea438cfa5c99288b961937755983f9967f79cc3263f39" Mar 17 11:38:40 crc kubenswrapper[4742]: E0317 11:38:40.873237 4742 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"496ffaed286d4bbdd8aea438cfa5c99288b961937755983f9967f79cc3263f39\": container with ID starting with 496ffaed286d4bbdd8aea438cfa5c99288b961937755983f9967f79cc3263f39 not found: ID does not exist" containerID="496ffaed286d4bbdd8aea438cfa5c99288b961937755983f9967f79cc3263f39" Mar 17 11:38:40 crc kubenswrapper[4742]: I0317 11:38:40.873266 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"496ffaed286d4bbdd8aea438cfa5c99288b961937755983f9967f79cc3263f39"} err="failed to get container status \"496ffaed286d4bbdd8aea438cfa5c99288b961937755983f9967f79cc3263f39\": rpc error: code = NotFound desc = could not find container \"496ffaed286d4bbdd8aea438cfa5c99288b961937755983f9967f79cc3263f39\": container with ID starting with 496ffaed286d4bbdd8aea438cfa5c99288b961937755983f9967f79cc3263f39 not found: ID does not exist" Mar 17 11:38:41 crc kubenswrapper[4742]: I0317 11:38:41.663520 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:38:41 crc kubenswrapper[4742]: E0317 11:38:41.663984 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:38:42 crc kubenswrapper[4742]: I0317 11:38:42.674609 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63dac183-f47c-4ce9-a7f7-75f14f7b52a2" path="/var/lib/kubelet/pods/63dac183-f47c-4ce9-a7f7-75f14f7b52a2/volumes" Mar 17 11:38:52 crc kubenswrapper[4742]: I0317 11:38:52.458127 4742 scope.go:117] "RemoveContainer" containerID="8eb2067fd4abccdf70a7351181706842d5af7477c7771cd486a8c0c2d41da946" Mar 17 11:38:52 crc kubenswrapper[4742]: I0317 11:38:52.524803 4742 scope.go:117] "RemoveContainer" containerID="edd0f9d20440eac5de4e8baf73493f31ba4d5d6aa7d7cc31b9cf148c15d9e47e" Mar 17 11:38:52 crc kubenswrapper[4742]: I0317 11:38:52.566975 4742 scope.go:117] "RemoveContainer" containerID="eb0dc37886b13f72e39bed5c9ab7ebbedd6f9e9cbd96d9aff5c8c8e4cf61f6c8" Mar 17 11:38:52 crc kubenswrapper[4742]: I0317 11:38:52.617328 4742 scope.go:117] "RemoveContainer" containerID="27ebc91815b4e6960eebc1a252e32864ae7dbbb0a6c3f4a56b1ebb4dccc14eaa" Mar 17 11:38:52 crc kubenswrapper[4742]: I0317 11:38:52.663138 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:38:52 crc kubenswrapper[4742]: E0317 11:38:52.663495 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:38:52 crc kubenswrapper[4742]: I0317 11:38:52.680129 4742 scope.go:117] "RemoveContainer" containerID="a39d766731a30ce0525a21395e6a75060ba4e8d7e54fa71ec5505353c307d099" Mar 17 11:38:52 crc kubenswrapper[4742]: I0317 11:38:52.709299 4742 scope.go:117] "RemoveContainer" 
containerID="0eba506e27377ff6e5b042e006a886dd1dcb2d7f32146237492a30397fd488f3" Mar 17 11:38:52 crc kubenswrapper[4742]: I0317 11:38:52.756296 4742 scope.go:117] "RemoveContainer" containerID="d1650d3cc9b26e02486db88fcd53040074b628ab99a1931281a1204591ad8624" Mar 17 11:38:52 crc kubenswrapper[4742]: I0317 11:38:52.808582 4742 scope.go:117] "RemoveContainer" containerID="e2282d37b0f4321b59cd38125bab6b08fe6bc64fa6ae9994352065ed6574c832" Mar 17 11:38:52 crc kubenswrapper[4742]: I0317 11:38:52.836947 4742 scope.go:117] "RemoveContainer" containerID="a171d31afe2a8fee197c1e824702384e0fe66f168432f5b13b1587e1abe0d3d0" Mar 17 11:38:52 crc kubenswrapper[4742]: I0317 11:38:52.868592 4742 scope.go:117] "RemoveContainer" containerID="b61e82d3be7e8c91ce430324c0ef11f59aeec617d63b6c7eadb8f30bc5b111f0" Mar 17 11:39:07 crc kubenswrapper[4742]: I0317 11:39:07.663498 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:39:07 crc kubenswrapper[4742]: E0317 11:39:07.664448 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:39:18 crc kubenswrapper[4742]: I0317 11:39:18.669808 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:39:18 crc kubenswrapper[4742]: E0317 11:39:18.670500 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:39:29 crc kubenswrapper[4742]: I0317 11:39:29.663395 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:39:29 crc kubenswrapper[4742]: E0317 11:39:29.664618 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:39:42 crc kubenswrapper[4742]: I0317 11:39:42.663550 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:39:42 crc kubenswrapper[4742]: E0317 11:39:42.664492 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:39:53 crc kubenswrapper[4742]: I0317 11:39:53.662781 4742 
scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:39:53 crc kubenswrapper[4742]: E0317 11:39:53.663644 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:40:00 crc kubenswrapper[4742]: I0317 11:40:00.203121 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562460-6xd9m"] Mar 17 11:40:00 crc kubenswrapper[4742]: E0317 11:40:00.204598 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63dac183-f47c-4ce9-a7f7-75f14f7b52a2" containerName="extract-utilities" Mar 17 11:40:00 crc kubenswrapper[4742]: I0317 11:40:00.204627 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="63dac183-f47c-4ce9-a7f7-75f14f7b52a2" containerName="extract-utilities" Mar 17 11:40:00 crc kubenswrapper[4742]: E0317 11:40:00.204659 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63dac183-f47c-4ce9-a7f7-75f14f7b52a2" containerName="extract-content" Mar 17 11:40:00 crc kubenswrapper[4742]: I0317 11:40:00.204672 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="63dac183-f47c-4ce9-a7f7-75f14f7b52a2" containerName="extract-content" Mar 17 11:40:00 crc kubenswrapper[4742]: E0317 11:40:00.204700 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63dac183-f47c-4ce9-a7f7-75f14f7b52a2" containerName="registry-server" Mar 17 11:40:00 crc kubenswrapper[4742]: I0317 11:40:00.204714 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="63dac183-f47c-4ce9-a7f7-75f14f7b52a2" containerName="registry-server" Mar 17 11:40:00 crc kubenswrapper[4742]: I0317 11:40:00.205111 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="63dac183-f47c-4ce9-a7f7-75f14f7b52a2" containerName="registry-server" Mar 17 11:40:00 crc kubenswrapper[4742]: I0317 11:40:00.206407 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562460-6xd9m" Mar 17 11:40:00 crc kubenswrapper[4742]: I0317 11:40:00.208698 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:40:00 crc kubenswrapper[4742]: I0317 11:40:00.209216 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:40:00 crc kubenswrapper[4742]: I0317 11:40:00.209365 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:40:00 crc kubenswrapper[4742]: I0317 11:40:00.231064 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562460-6xd9m"] Mar 17 11:40:00 crc kubenswrapper[4742]: I0317 11:40:00.347441 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlnct\" (UniqueName: \"kubernetes.io/projected/f5d96418-f216-474c-a8e5-d73833a30fd8-kube-api-access-tlnct\") pod \"auto-csr-approver-29562460-6xd9m\" (UID: \"f5d96418-f216-474c-a8e5-d73833a30fd8\") " pod="openshift-infra/auto-csr-approver-29562460-6xd9m" Mar 17 11:40:00 crc kubenswrapper[4742]: I0317 11:40:00.449550 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlnct\" (UniqueName: \"kubernetes.io/projected/f5d96418-f216-474c-a8e5-d73833a30fd8-kube-api-access-tlnct\") pod \"auto-csr-approver-29562460-6xd9m\" (UID: \"f5d96418-f216-474c-a8e5-d73833a30fd8\") " pod="openshift-infra/auto-csr-approver-29562460-6xd9m" Mar 17 11:40:00 crc kubenswrapper[4742]: I0317 11:40:00.469827 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlnct\" (UniqueName: \"kubernetes.io/projected/f5d96418-f216-474c-a8e5-d73833a30fd8-kube-api-access-tlnct\") pod \"auto-csr-approver-29562460-6xd9m\" (UID: \"f5d96418-f216-474c-a8e5-d73833a30fd8\") " pod="openshift-infra/auto-csr-approver-29562460-6xd9m" Mar 17 11:40:00 crc kubenswrapper[4742]: I0317 11:40:00.536886 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562460-6xd9m" Mar 17 11:40:01 crc kubenswrapper[4742]: I0317 11:40:01.001707 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562460-6xd9m"] Mar 17 11:40:01 crc kubenswrapper[4742]: I0317 11:40:01.697099 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562460-6xd9m" event={"ID":"f5d96418-f216-474c-a8e5-d73833a30fd8","Type":"ContainerStarted","Data":"8fa25a6eae8c47844d841b254b62eb28940239dcebd561ad66d045cac3fc7ed0"} Mar 17 11:40:02 crc kubenswrapper[4742]: I0317 11:40:02.707363 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562460-6xd9m" event={"ID":"f5d96418-f216-474c-a8e5-d73833a30fd8","Type":"ContainerStarted","Data":"338466036db2afa1f87e10d7311dfc4b58aebdb5b513416a8bc75539002cc133"} Mar 17 11:40:02 crc kubenswrapper[4742]: I0317 11:40:02.739095 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562460-6xd9m" podStartSLOduration=1.564828254 podStartE2EDuration="2.739074778s" podCreationTimestamp="2026-03-17 11:40:00 +0000 UTC" firstStartedPulling="2026-03-17 11:40:01.015364962 +0000 UTC m=+1704.141492710" lastFinishedPulling="2026-03-17 11:40:02.189611436 +0000 UTC m=+1705.315739234" observedRunningTime="2026-03-17 11:40:02.731515919 +0000 UTC m=+1705.857643717" watchObservedRunningTime="2026-03-17 11:40:02.739074778 +0000 UTC m=+1705.865202536" Mar 17 11:40:03 crc kubenswrapper[4742]: I0317 11:40:03.721928 4742 generic.go:334] "Generic (PLEG): container finished" podID="f5d96418-f216-474c-a8e5-d73833a30fd8" containerID="338466036db2afa1f87e10d7311dfc4b58aebdb5b513416a8bc75539002cc133" exitCode=0 Mar 17 11:40:03 crc kubenswrapper[4742]: I0317 11:40:03.722028 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562460-6xd9m" event={"ID":"f5d96418-f216-474c-a8e5-d73833a30fd8","Type":"ContainerDied","Data":"338466036db2afa1f87e10d7311dfc4b58aebdb5b513416a8bc75539002cc133"} Mar 17 11:40:04 crc kubenswrapper[4742]: I0317 11:40:04.662856 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:40:04 crc kubenswrapper[4742]: E0317 11:40:04.663202 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:40:05 crc kubenswrapper[4742]: I0317 11:40:05.130987 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562460-6xd9m" Mar 17 11:40:05 crc kubenswrapper[4742]: I0317 11:40:05.251985 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlnct\" (UniqueName: \"kubernetes.io/projected/f5d96418-f216-474c-a8e5-d73833a30fd8-kube-api-access-tlnct\") pod \"f5d96418-f216-474c-a8e5-d73833a30fd8\" (UID: \"f5d96418-f216-474c-a8e5-d73833a30fd8\") " Mar 17 11:40:05 crc kubenswrapper[4742]: I0317 11:40:05.262733 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5d96418-f216-474c-a8e5-d73833a30fd8-kube-api-access-tlnct" (OuterVolumeSpecName: "kube-api-access-tlnct") pod "f5d96418-f216-474c-a8e5-d73833a30fd8" (UID: "f5d96418-f216-474c-a8e5-d73833a30fd8"). InnerVolumeSpecName "kube-api-access-tlnct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:40:05 crc kubenswrapper[4742]: I0317 11:40:05.353944 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlnct\" (UniqueName: \"kubernetes.io/projected/f5d96418-f216-474c-a8e5-d73833a30fd8-kube-api-access-tlnct\") on node \"crc\" DevicePath \"\"" Mar 17 11:40:05 crc kubenswrapper[4742]: I0317 11:40:05.749139 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562460-6xd9m" event={"ID":"f5d96418-f216-474c-a8e5-d73833a30fd8","Type":"ContainerDied","Data":"8fa25a6eae8c47844d841b254b62eb28940239dcebd561ad66d045cac3fc7ed0"} Mar 17 11:40:05 crc kubenswrapper[4742]: I0317 11:40:05.749189 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fa25a6eae8c47844d841b254b62eb28940239dcebd561ad66d045cac3fc7ed0" Mar 17 11:40:05 crc kubenswrapper[4742]: I0317 11:40:05.749268 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562460-6xd9m" Mar 17 11:40:05 crc kubenswrapper[4742]: I0317 11:40:05.814696 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562454-s7z5z"] Mar 17 11:40:05 crc kubenswrapper[4742]: I0317 11:40:05.825502 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562454-s7z5z"] Mar 17 11:40:06 crc kubenswrapper[4742]: I0317 11:40:06.678640 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df" path="/var/lib/kubelet/pods/bd3b61e4-4b1d-4ba7-ba30-068a43a8d8df/volumes" Mar 17 11:40:18 crc kubenswrapper[4742]: I0317 11:40:18.663272 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:40:18 crc kubenswrapper[4742]: E0317 11:40:18.664097 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:40:18 crc kubenswrapper[4742]: I0317 11:40:18.906976 4742 generic.go:334] "Generic (PLEG): container finished" podID="e6bf81f0-73d3-4dde-937d-87bbea94c36e" containerID="3e919279a68a16517086c2944a547a01a33d918ac6faff393c030aa60ace2ba5" exitCode=0 Mar 17 11:40:18 crc kubenswrapper[4742]: I0317 11:40:18.907044 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" event={"ID":"e6bf81f0-73d3-4dde-937d-87bbea94c36e","Type":"ContainerDied","Data":"3e919279a68a16517086c2944a547a01a33d918ac6faff393c030aa60ace2ba5"} Mar 17 11:40:20 crc kubenswrapper[4742]: I0317 11:40:20.395435 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" Mar 17 11:40:20 crc kubenswrapper[4742]: I0317 11:40:20.482012 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-inventory\") pod \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\" (UID: \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\") " Mar 17 11:40:20 crc kubenswrapper[4742]: I0317 11:40:20.482400 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-bootstrap-combined-ca-bundle\") pod \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\" (UID: \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\") " Mar 17 11:40:20 crc kubenswrapper[4742]: I0317 11:40:20.482467 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cpjh\" (UniqueName: \"kubernetes.io/projected/e6bf81f0-73d3-4dde-937d-87bbea94c36e-kube-api-access-5cpjh\") pod \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\" (UID: \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\") " Mar 17 11:40:20 crc kubenswrapper[4742]: I0317 11:40:20.482500 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-ssh-key-openstack-edpm-ipam\") pod \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\" (UID: \"e6bf81f0-73d3-4dde-937d-87bbea94c36e\") " Mar 17 11:40:20 crc kubenswrapper[4742]: I0317 11:40:20.487550 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6bf81f0-73d3-4dde-937d-87bbea94c36e-kube-api-access-5cpjh" (OuterVolumeSpecName: "kube-api-access-5cpjh") pod "e6bf81f0-73d3-4dde-937d-87bbea94c36e" (UID: "e6bf81f0-73d3-4dde-937d-87bbea94c36e"). InnerVolumeSpecName "kube-api-access-5cpjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:40:20 crc kubenswrapper[4742]: I0317 11:40:20.489542 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e6bf81f0-73d3-4dde-937d-87bbea94c36e" (UID: "e6bf81f0-73d3-4dde-937d-87bbea94c36e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:40:20 crc kubenswrapper[4742]: I0317 11:40:20.513101 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-inventory" (OuterVolumeSpecName: "inventory") pod "e6bf81f0-73d3-4dde-937d-87bbea94c36e" (UID: "e6bf81f0-73d3-4dde-937d-87bbea94c36e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:40:20 crc kubenswrapper[4742]: I0317 11:40:20.515625 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e6bf81f0-73d3-4dde-937d-87bbea94c36e" (UID: "e6bf81f0-73d3-4dde-937d-87bbea94c36e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:40:20 crc kubenswrapper[4742]: I0317 11:40:20.584520 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cpjh\" (UniqueName: \"kubernetes.io/projected/e6bf81f0-73d3-4dde-937d-87bbea94c36e-kube-api-access-5cpjh\") on node \"crc\" DevicePath \"\"" Mar 17 11:40:20 crc kubenswrapper[4742]: I0317 11:40:20.584556 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 11:40:20 crc kubenswrapper[4742]: I0317 11:40:20.584568 4742 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 11:40:20 crc kubenswrapper[4742]: I0317 11:40:20.584576 4742 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6bf81f0-73d3-4dde-937d-87bbea94c36e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:40:20 crc kubenswrapper[4742]: I0317 11:40:20.937992 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" event={"ID":"e6bf81f0-73d3-4dde-937d-87bbea94c36e","Type":"ContainerDied","Data":"14c7e39d8287c10268c2cc432d93fa47bbb77f98a36dd12d9d9c8387761f113f"} Mar 17 11:40:20 crc kubenswrapper[4742]: I0317 11:40:20.938055 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14c7e39d8287c10268c2cc432d93fa47bbb77f98a36dd12d9d9c8387761f113f" Mar 17 11:40:20 crc kubenswrapper[4742]: I0317 11:40:20.938073 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.055570 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds"] Mar 17 11:40:21 crc kubenswrapper[4742]: E0317 11:40:21.057012 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5d96418-f216-474c-a8e5-d73833a30fd8" containerName="oc" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.057060 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d96418-f216-474c-a8e5-d73833a30fd8" containerName="oc" Mar 17 11:40:21 crc kubenswrapper[4742]: E0317 11:40:21.057089 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6bf81f0-73d3-4dde-937d-87bbea94c36e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.057096 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6bf81f0-73d3-4dde-937d-87bbea94c36e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.057474 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5d96418-f216-474c-a8e5-d73833a30fd8" containerName="oc" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.057511 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6bf81f0-73d3-4dde-937d-87bbea94c36e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.058401 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.061632 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.061983 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8b7p" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.062202 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.064343 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.085933 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds"] Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.093937 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmj5v\" (UniqueName: \"kubernetes.io/projected/a8691841-aa32-407b-bbdc-97c5551ec591-kube-api-access-mmj5v\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mvrds\" (UID: \"a8691841-aa32-407b-bbdc-97c5551ec591\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.094123 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8691841-aa32-407b-bbdc-97c5551ec591-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mvrds\" (UID: \"a8691841-aa32-407b-bbdc-97c5551ec591\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.094214 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8691841-aa32-407b-bbdc-97c5551ec591-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mvrds\" (UID: \"a8691841-aa32-407b-bbdc-97c5551ec591\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.196257 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmj5v\" (UniqueName: \"kubernetes.io/projected/a8691841-aa32-407b-bbdc-97c5551ec591-kube-api-access-mmj5v\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mvrds\" (UID: \"a8691841-aa32-407b-bbdc-97c5551ec591\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.196347 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8691841-aa32-407b-bbdc-97c5551ec591-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mvrds\" (UID: \"a8691841-aa32-407b-bbdc-97c5551ec591\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.196404 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a8691841-aa32-407b-bbdc-97c5551ec591-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mvrds\" (UID: \"a8691841-aa32-407b-bbdc-97c5551ec591\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.200946 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8691841-aa32-407b-bbdc-97c5551ec591-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mvrds\" (UID: \"a8691841-aa32-407b-bbdc-97c5551ec591\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.212856 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8691841-aa32-407b-bbdc-97c5551ec591-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mvrds\" (UID: \"a8691841-aa32-407b-bbdc-97c5551ec591\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.231851 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmj5v\" (UniqueName: \"kubernetes.io/projected/a8691841-aa32-407b-bbdc-97c5551ec591-kube-api-access-mmj5v\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mvrds\" (UID: \"a8691841-aa32-407b-bbdc-97c5551ec591\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.392629 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" Mar 17 11:40:21 crc kubenswrapper[4742]: I0317 11:40:21.988044 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds"] Mar 17 11:40:22 crc kubenswrapper[4742]: I0317 11:40:22.966556 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" event={"ID":"a8691841-aa32-407b-bbdc-97c5551ec591","Type":"ContainerStarted","Data":"0382835c71e7829915c75cc32af1ce54d83dc9c01266fca1da2c8afe882a0723"} Mar 17 11:40:22 crc kubenswrapper[4742]: I0317 11:40:22.967223 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" event={"ID":"a8691841-aa32-407b-bbdc-97c5551ec591","Type":"ContainerStarted","Data":"19aa26a89012a7880a99ba5ed1d440fd69ca40a56be7ad3c73baafd281cab047"} Mar 17 11:40:22 crc kubenswrapper[4742]: I0317 11:40:22.992802 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" podStartSLOduration=1.342599995 podStartE2EDuration="1.992785895s" podCreationTimestamp="2026-03-17 11:40:21 +0000 UTC" firstStartedPulling="2026-03-17 11:40:21.984504603 +0000 UTC m=+1725.110632381" lastFinishedPulling="2026-03-17 11:40:22.634690523 +0000 UTC m=+1725.760818281" observedRunningTime="2026-03-17 11:40:22.990514182 +0000 UTC m=+1726.116641930" watchObservedRunningTime="2026-03-17 11:40:22.992785895 +0000 UTC m=+1726.118913653" Mar 17 11:40:32 crc kubenswrapper[4742]: I0317 11:40:32.663170 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:40:32 
crc kubenswrapper[4742]: E0317 11:40:32.664549 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:40:46 crc kubenswrapper[4742]: I0317 11:40:46.663107 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:40:46 crc kubenswrapper[4742]: E0317 11:40:46.664448 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:40:53 crc kubenswrapper[4742]: I0317 11:40:53.118012 4742 scope.go:117] "RemoveContainer" containerID="9a27c20a440ff32b968b604ff8353be71ac1dc698779ef00ba40c41a91b652a5" Mar 17 11:40:53 crc kubenswrapper[4742]: I0317 11:40:53.158027 4742 scope.go:117] "RemoveContainer" containerID="e9ea045617080df7a23906a3524c3d38b58b57ae4361803d875fd610a01afe54" Mar 17 11:41:00 crc kubenswrapper[4742]: I0317 11:41:00.664681 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:41:00 crc kubenswrapper[4742]: E0317 11:41:00.666009 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:41:12 crc kubenswrapper[4742]: I0317 11:41:12.664821 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:41:12 crc kubenswrapper[4742]: E0317 11:41:12.666396 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:41:27 crc kubenswrapper[4742]: I0317 11:41:27.680004 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:41:27 crc kubenswrapper[4742]: E0317 11:41:27.681366 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:41:37 crc 
kubenswrapper[4742]: I0317 11:41:37.060468 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-18db-account-create-update-psfh5"] Mar 17 11:41:37 crc kubenswrapper[4742]: I0317 11:41:37.074863 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-18db-account-create-update-psfh5"] Mar 17 11:41:37 crc kubenswrapper[4742]: I0317 11:41:37.085223 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-065f-account-create-update-cxjdl"] Mar 17 11:41:37 crc kubenswrapper[4742]: I0317 11:41:37.099602 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-wp4gd"] Mar 17 11:41:37 crc kubenswrapper[4742]: I0317 11:41:37.108020 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-97pcv"] Mar 17 11:41:37 crc kubenswrapper[4742]: I0317 11:41:37.117435 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-97pcv"] Mar 17 11:41:37 crc kubenswrapper[4742]: I0317 11:41:37.125553 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-wp4gd"] Mar 17 11:41:37 crc kubenswrapper[4742]: I0317 11:41:37.132938 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-065f-account-create-update-cxjdl"] Mar 17 11:41:38 crc kubenswrapper[4742]: I0317 11:41:38.679589 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19a4cab3-fe96-424d-a768-741b2c01d8e0" path="/var/lib/kubelet/pods/19a4cab3-fe96-424d-a768-741b2c01d8e0/volumes" Mar 17 11:41:38 crc kubenswrapper[4742]: I0317 11:41:38.681008 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28db6ae8-7bd6-4c58-9b49-7349349da904" path="/var/lib/kubelet/pods/28db6ae8-7bd6-4c58-9b49-7349349da904/volumes" Mar 17 11:41:38 crc kubenswrapper[4742]: I0317 11:41:38.681811 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d80d7e30-5242-48a7-b61b-6e3d74364128" path="/var/lib/kubelet/pods/d80d7e30-5242-48a7-b61b-6e3d74364128/volumes" Mar 17 11:41:38 crc kubenswrapper[4742]: I0317 11:41:38.682575 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d84251cb-eea2-41f7-b743-ab3a4d0c4ae1" path="/var/lib/kubelet/pods/d84251cb-eea2-41f7-b743-ab3a4d0c4ae1/volumes" Mar 17 11:41:41 crc kubenswrapper[4742]: I0317 11:41:41.042429 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-cqlsx"] Mar 17 11:41:41 crc kubenswrapper[4742]: I0317 11:41:41.057753 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8f69-account-create-update-pwhwq"] Mar 17 11:41:41 crc kubenswrapper[4742]: I0317 11:41:41.069989 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-cqlsx"] Mar 17 11:41:41 crc kubenswrapper[4742]: I0317 11:41:41.081987 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8f69-account-create-update-pwhwq"] Mar 17 11:41:42 crc kubenswrapper[4742]: I0317 11:41:42.662928 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:41:42 crc kubenswrapper[4742]: E0317 11:41:42.663168 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:41:42 crc kubenswrapper[4742]: I0317 11:41:42.674463 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308384f8-4874-467d-92e9-d5078d3017b5" path="/var/lib/kubelet/pods/308384f8-4874-467d-92e9-d5078d3017b5/volumes" Mar 17 11:41:42 crc kubenswrapper[4742]: I0317 11:41:42.675178 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f" path="/var/lib/kubelet/pods/86c59a5d-73d4-45e9-bb2f-cbf9fa687b6f/volumes" Mar 17 11:41:53 crc kubenswrapper[4742]: I0317 11:41:53.255004 4742 scope.go:117] "RemoveContainer" containerID="fa6cb733ef3884e320cea26203b6b8787d65cee2e29770deaed5aba2116e8767" Mar 17 11:41:53 crc kubenswrapper[4742]: I0317 11:41:53.283532 4742 scope.go:117] "RemoveContainer" containerID="b9f2b21d6dbf29c7d2ced36e179de5481f77bed73d7e2dd9e4e7cd9bf055b81e" Mar 17 11:41:53 crc kubenswrapper[4742]: I0317 11:41:53.314456 4742 scope.go:117] "RemoveContainer" containerID="a9f80e4999e79f490ef91c75e4f6c00600be5edfe0a5f2a06f1503c01a02c9de" Mar 17 11:41:53 crc kubenswrapper[4742]: I0317 11:41:53.339725 4742 scope.go:117] "RemoveContainer" containerID="e29ae2e2808df9beb7293f4ecf1cda6fd49e1a8e0254b2fdfa6cae19752cba69" Mar 17 11:41:53 crc kubenswrapper[4742]: I0317 11:41:53.422310 4742 scope.go:117] "RemoveContainer" containerID="dc053eca8afdef59c2d596b1cd594e09c250119abb63e9f7c1a1de6724d9bac1" Mar 17 11:41:53 crc kubenswrapper[4742]: I0317 11:41:53.460493 4742 scope.go:117] "RemoveContainer" containerID="af718105c77fc34a33fccece18d4b68331853b7c0e36f08268c37e336bcf5dcd" Mar 17 11:41:53 crc kubenswrapper[4742]: I0317 11:41:53.517640 4742 scope.go:117] "RemoveContainer" containerID="d3abbd19ee12b5bda3502280acd949d13d0a02256a13414347d7f4740c40d154" Mar 17 11:41:53 crc kubenswrapper[4742]: I0317 11:41:53.542121 4742 scope.go:117] "RemoveContainer" containerID="5b537e8a453a925bd038aa6d28e37d38d71cac184d10382ddefd9f7537a455a0" Mar 17 11:41:54 crc kubenswrapper[4742]: I0317 11:41:54.663691 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:41:54 crc kubenswrapper[4742]: E0317 11:41:54.664545 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:42:00 crc kubenswrapper[4742]: I0317 11:42:00.151402 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562462-8bc7k"] Mar 17 11:42:00 crc kubenswrapper[4742]: I0317 11:42:00.153074 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562462-8bc7k" Mar 17 11:42:00 crc kubenswrapper[4742]: I0317 11:42:00.155997 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:42:00 crc kubenswrapper[4742]: I0317 11:42:00.156130 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:42:00 crc kubenswrapper[4742]: I0317 11:42:00.156692 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:42:00 crc kubenswrapper[4742]: I0317 11:42:00.173003 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562462-8bc7k"] Mar 17 11:42:00 crc kubenswrapper[4742]: I0317 11:42:00.299670 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wnvs\" (UniqueName: \"kubernetes.io/projected/e8bc9906-5271-4a69-8fa3-e5106f062ac2-kube-api-access-6wnvs\") pod \"auto-csr-approver-29562462-8bc7k\" (UID: \"e8bc9906-5271-4a69-8fa3-e5106f062ac2\") " pod="openshift-infra/auto-csr-approver-29562462-8bc7k" Mar 17 11:42:00 crc kubenswrapper[4742]: I0317 11:42:00.402492 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wnvs\" (UniqueName: \"kubernetes.io/projected/e8bc9906-5271-4a69-8fa3-e5106f062ac2-kube-api-access-6wnvs\") pod \"auto-csr-approver-29562462-8bc7k\" (UID: \"e8bc9906-5271-4a69-8fa3-e5106f062ac2\") " pod="openshift-infra/auto-csr-approver-29562462-8bc7k" Mar 17 11:42:00 crc kubenswrapper[4742]: I0317 11:42:00.438345 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wnvs\" (UniqueName: \"kubernetes.io/projected/e8bc9906-5271-4a69-8fa3-e5106f062ac2-kube-api-access-6wnvs\") pod \"auto-csr-approver-29562462-8bc7k\" (UID: \"e8bc9906-5271-4a69-8fa3-e5106f062ac2\") " pod="openshift-infra/auto-csr-approver-29562462-8bc7k" Mar 17 11:42:00 crc kubenswrapper[4742]: I0317 11:42:00.481722 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562462-8bc7k" Mar 17 11:42:00 crc kubenswrapper[4742]: I0317 11:42:00.759823 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562462-8bc7k"] Mar 17 11:42:01 crc kubenswrapper[4742]: I0317 11:42:01.167321 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562462-8bc7k" event={"ID":"e8bc9906-5271-4a69-8fa3-e5106f062ac2","Type":"ContainerStarted","Data":"3756463dc6ba1e0a0899b28a5bbd96a1e2276ff7e3f7e4d81af39c8e9022d30f"} Mar 17 11:42:02 crc kubenswrapper[4742]: I0317 11:42:02.184823 4742 generic.go:334] "Generic (PLEG): container finished" podID="a8691841-aa32-407b-bbdc-97c5551ec591" containerID="0382835c71e7829915c75cc32af1ce54d83dc9c01266fca1da2c8afe882a0723" exitCode=0 Mar 17 11:42:02 crc kubenswrapper[4742]: I0317 11:42:02.185129 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" event={"ID":"a8691841-aa32-407b-bbdc-97c5551ec591","Type":"ContainerDied","Data":"0382835c71e7829915c75cc32af1ce54d83dc9c01266fca1da2c8afe882a0723"} Mar 17 11:42:02 crc kubenswrapper[4742]: I0317 11:42:02.188781 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562462-8bc7k" event={"ID":"e8bc9906-5271-4a69-8fa3-e5106f062ac2","Type":"ContainerStarted","Data":"b6f9d44cd7e38ad91669d5e736d3b37c406dec4a78ae39cf25b269eb0eaeefd3"} Mar 17 11:42:02 crc kubenswrapper[4742]: I0317 11:42:02.231103 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562462-8bc7k" podStartSLOduration=1.25167773 podStartE2EDuration="2.231082288s" podCreationTimestamp="2026-03-17 11:42:00 +0000 UTC" firstStartedPulling="2026-03-17 11:42:00.782962088 +0000 UTC m=+1823.909089846" lastFinishedPulling="2026-03-17 11:42:01.762366646 +0000 UTC m=+1824.888494404" observedRunningTime="2026-03-17 11:42:02.224509857 +0000 UTC m=+1825.350637635" watchObservedRunningTime="2026-03-17 11:42:02.231082288 +0000 UTC m=+1825.357210046" Mar 17 11:42:03 crc kubenswrapper[4742]: I0317 11:42:03.201471 4742 generic.go:334] "Generic (PLEG): container finished" podID="e8bc9906-5271-4a69-8fa3-e5106f062ac2" containerID="b6f9d44cd7e38ad91669d5e736d3b37c406dec4a78ae39cf25b269eb0eaeefd3" exitCode=0 Mar 17 11:42:03 crc kubenswrapper[4742]: I0317 11:42:03.201575 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562462-8bc7k" event={"ID":"e8bc9906-5271-4a69-8fa3-e5106f062ac2","Type":"ContainerDied","Data":"b6f9d44cd7e38ad91669d5e736d3b37c406dec4a78ae39cf25b269eb0eaeefd3"} Mar 17 11:42:03 crc kubenswrapper[4742]: I0317 11:42:03.718283 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" Mar 17 11:42:03 crc kubenswrapper[4742]: I0317 11:42:03.875442 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8691841-aa32-407b-bbdc-97c5551ec591-ssh-key-openstack-edpm-ipam\") pod \"a8691841-aa32-407b-bbdc-97c5551ec591\" (UID: \"a8691841-aa32-407b-bbdc-97c5551ec591\") " Mar 17 11:42:03 crc kubenswrapper[4742]: I0317 11:42:03.875861 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8691841-aa32-407b-bbdc-97c5551ec591-inventory\") pod \"a8691841-aa32-407b-bbdc-97c5551ec591\" (UID: \"a8691841-aa32-407b-bbdc-97c5551ec591\") " Mar 17 11:42:03 crc kubenswrapper[4742]: I0317 11:42:03.875892 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmj5v\" (UniqueName: \"kubernetes.io/projected/a8691841-aa32-407b-bbdc-97c5551ec591-kube-api-access-mmj5v\") pod \"a8691841-aa32-407b-bbdc-97c5551ec591\" (UID: \"a8691841-aa32-407b-bbdc-97c5551ec591\") " Mar 17 11:42:03 crc kubenswrapper[4742]: I0317 11:42:03.882174 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8691841-aa32-407b-bbdc-97c5551ec591-kube-api-access-mmj5v" (OuterVolumeSpecName: "kube-api-access-mmj5v") pod "a8691841-aa32-407b-bbdc-97c5551ec591" (UID: "a8691841-aa32-407b-bbdc-97c5551ec591"). InnerVolumeSpecName "kube-api-access-mmj5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:42:03 crc kubenswrapper[4742]: I0317 11:42:03.906969 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8691841-aa32-407b-bbdc-97c5551ec591-inventory" (OuterVolumeSpecName: "inventory") pod "a8691841-aa32-407b-bbdc-97c5551ec591" (UID: "a8691841-aa32-407b-bbdc-97c5551ec591"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:42:03 crc kubenswrapper[4742]: I0317 11:42:03.922707 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8691841-aa32-407b-bbdc-97c5551ec591-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a8691841-aa32-407b-bbdc-97c5551ec591" (UID: "a8691841-aa32-407b-bbdc-97c5551ec591"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:42:03 crc kubenswrapper[4742]: I0317 11:42:03.979277 4742 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a8691841-aa32-407b-bbdc-97c5551ec591-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 11:42:03 crc kubenswrapper[4742]: I0317 11:42:03.979330 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmj5v\" (UniqueName: \"kubernetes.io/projected/a8691841-aa32-407b-bbdc-97c5551ec591-kube-api-access-mmj5v\") on node \"crc\" DevicePath \"\"" Mar 17 11:42:03 crc kubenswrapper[4742]: I0317 11:42:03.979355 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a8691841-aa32-407b-bbdc-97c5551ec591-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.216731 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.216817 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mvrds" event={"ID":"a8691841-aa32-407b-bbdc-97c5551ec591","Type":"ContainerDied","Data":"19aa26a89012a7880a99ba5ed1d440fd69ca40a56be7ad3c73baafd281cab047"} Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.216855 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19aa26a89012a7880a99ba5ed1d440fd69ca40a56be7ad3c73baafd281cab047" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.314613 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh"] Mar 17 11:42:04 crc kubenswrapper[4742]: E0317 11:42:04.315029 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8691841-aa32-407b-bbdc-97c5551ec591" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.315046 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8691841-aa32-407b-bbdc-97c5551ec591" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.315236 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8691841-aa32-407b-bbdc-97c5551ec591" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.315790 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.318368 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.319249 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.319392 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8b7p" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.319409 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.329976 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh"] Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.387228 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh\" (UID: \"bd4b8d37-8f12-4560-b616-cbbed45a7cb2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.387299 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lshlh\" (UniqueName: \"kubernetes.io/projected/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-kube-api-access-lshlh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh\" (UID: \"bd4b8d37-8f12-4560-b616-cbbed45a7cb2\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.387326 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh\" (UID: \"bd4b8d37-8f12-4560-b616-cbbed45a7cb2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.488946 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh\" (UID: \"bd4b8d37-8f12-4560-b616-cbbed45a7cb2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.489054 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lshlh\" (UniqueName: \"kubernetes.io/projected/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-kube-api-access-lshlh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh\" (UID: \"bd4b8d37-8f12-4560-b616-cbbed45a7cb2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.489084 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh\" (UID: \"bd4b8d37-8f12-4560-b616-cbbed45a7cb2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.493203 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh\" (UID: \"bd4b8d37-8f12-4560-b616-cbbed45a7cb2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.496770 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh\" (UID: \"bd4b8d37-8f12-4560-b616-cbbed45a7cb2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.512468 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lshlh\" (UniqueName: \"kubernetes.io/projected/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-kube-api-access-lshlh\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh\" (UID: \"bd4b8d37-8f12-4560-b616-cbbed45a7cb2\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.553337 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562462-8bc7k" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.642929 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.692078 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wnvs\" (UniqueName: \"kubernetes.io/projected/e8bc9906-5271-4a69-8fa3-e5106f062ac2-kube-api-access-6wnvs\") pod \"e8bc9906-5271-4a69-8fa3-e5106f062ac2\" (UID: \"e8bc9906-5271-4a69-8fa3-e5106f062ac2\") " Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.701063 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8bc9906-5271-4a69-8fa3-e5106f062ac2-kube-api-access-6wnvs" (OuterVolumeSpecName: "kube-api-access-6wnvs") pod "e8bc9906-5271-4a69-8fa3-e5106f062ac2" (UID: "e8bc9906-5271-4a69-8fa3-e5106f062ac2"). InnerVolumeSpecName "kube-api-access-6wnvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:42:04 crc kubenswrapper[4742]: I0317 11:42:04.796676 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wnvs\" (UniqueName: \"kubernetes.io/projected/e8bc9906-5271-4a69-8fa3-e5106f062ac2-kube-api-access-6wnvs\") on node \"crc\" DevicePath \"\"" Mar 17 11:42:05 crc kubenswrapper[4742]: I0317 11:42:05.194660 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh"] Mar 17 11:42:05 crc kubenswrapper[4742]: W0317 11:42:05.201109 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd4b8d37_8f12_4560_b616_cbbed45a7cb2.slice/crio-bb4455f813c27aa57a18df3febb56ca61771a5d5130617a0d2ae2f73f29efee5 WatchSource:0}: Error finding container bb4455f813c27aa57a18df3febb56ca61771a5d5130617a0d2ae2f73f29efee5: Status 404 returned error can't find the container with id bb4455f813c27aa57a18df3febb56ca61771a5d5130617a0d2ae2f73f29efee5 Mar 17 11:42:05 crc kubenswrapper[4742]: I0317 11:42:05.235542 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562462-8bc7k" Mar 17 11:42:05 crc kubenswrapper[4742]: I0317 11:42:05.235557 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562462-8bc7k" event={"ID":"e8bc9906-5271-4a69-8fa3-e5106f062ac2","Type":"ContainerDied","Data":"3756463dc6ba1e0a0899b28a5bbd96a1e2276ff7e3f7e4d81af39c8e9022d30f"} Mar 17 11:42:05 crc kubenswrapper[4742]: I0317 11:42:05.235597 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3756463dc6ba1e0a0899b28a5bbd96a1e2276ff7e3f7e4d81af39c8e9022d30f" Mar 17 11:42:05 crc kubenswrapper[4742]: I0317 11:42:05.238685 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" event={"ID":"bd4b8d37-8f12-4560-b616-cbbed45a7cb2","Type":"ContainerStarted","Data":"bb4455f813c27aa57a18df3febb56ca61771a5d5130617a0d2ae2f73f29efee5"} Mar 17 11:42:05 crc kubenswrapper[4742]: I0317 11:42:05.641106 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562456-zl42f"] Mar 17 11:42:05 crc kubenswrapper[4742]: I0317 11:42:05.651505 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562456-zl42f"] Mar 17 11:42:06 crc kubenswrapper[4742]: I0317 11:42:06.043044 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-6sr2t"] Mar 17 11:42:06 crc kubenswrapper[4742]: I0317 11:42:06.055158 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-6sr2t"] Mar 17 11:42:06 crc kubenswrapper[4742]: I0317 11:42:06.257017 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" event={"ID":"bd4b8d37-8f12-4560-b616-cbbed45a7cb2","Type":"ContainerStarted","Data":"1381465963db8a53d4dcf2faa1e2ecb764ba8fceac0a265436d632f753dc62de"} Mar 17 11:42:06 crc kubenswrapper[4742]: I0317 11:42:06.283839 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" podStartSLOduration=1.53813404 podStartE2EDuration="2.283820775s" podCreationTimestamp="2026-03-17 11:42:04 +0000 UTC" firstStartedPulling="2026-03-17 11:42:05.203927835 +0000 UTC m=+1828.330055603" lastFinishedPulling="2026-03-17 11:42:05.94961458 +0000 UTC m=+1829.075742338" observedRunningTime="2026-03-17 11:42:06.281358278 +0000 UTC m=+1829.407486046" watchObservedRunningTime="2026-03-17 11:42:06.283820775 +0000 UTC m=+1829.409948523" Mar 17 11:42:06 crc kubenswrapper[4742]: I0317 11:42:06.673554 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="828f87ee-72a4-43e5-88f7-0a15975b90a5" path="/var/lib/kubelet/pods/828f87ee-72a4-43e5-88f7-0a15975b90a5/volumes" Mar 17 11:42:06 crc kubenswrapper[4742]: I0317 11:42:06.674254 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd3294f4-0a77-467f-8178-8631afe227fe" path="/var/lib/kubelet/pods/dd3294f4-0a77-467f-8178-8631afe227fe/volumes" Mar 17 11:42:07 crc kubenswrapper[4742]: I0317 11:42:07.663047 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:42:07 crc kubenswrapper[4742]: E0317 11:42:07.663573 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:42:13 crc kubenswrapper[4742]: I0317 11:42:13.049002 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-5f9b-account-create-update-7kbcm"] Mar 17 11:42:13 crc kubenswrapper[4742]: I0317 11:42:13.073671 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-dps6c"] Mar 17 11:42:13 crc kubenswrapper[4742]: I0317 11:42:13.101515 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-5f9b-account-create-update-7kbcm"] Mar 17 11:42:13 crc kubenswrapper[4742]: I0317 11:42:13.111093 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-dps6c"] Mar 17 11:42:14 crc kubenswrapper[4742]: I0317 11:42:14.686258 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e82963e-fa88-4b3c-847c-4fc0976e63b0" path="/var/lib/kubelet/pods/1e82963e-fa88-4b3c-847c-4fc0976e63b0/volumes" Mar 17 11:42:14 crc kubenswrapper[4742]: I0317 11:42:14.687737 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45ef3446-d4b4-42dd-862f-0e4c548a9752" path="/var/lib/kubelet/pods/45ef3446-d4b4-42dd-862f-0e4c548a9752/volumes" Mar 17 11:42:17 crc kubenswrapper[4742]: I0317 11:42:17.046209 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-dhlk7"] Mar 17 11:42:17 crc kubenswrapper[4742]: I0317 11:42:17.061521 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-k2nnj"] Mar 17 11:42:17 crc kubenswrapper[4742]: I0317 11:42:17.074088 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-57b1-account-create-update-rgxfk"] Mar 17 11:42:17 crc kubenswrapper[4742]: I0317 11:42:17.083696 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f965-account-create-update-mfv8t"] Mar 17 11:42:17 crc kubenswrapper[4742]: I0317 11:42:17.091956 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-dhlk7"] Mar 17 11:42:17 crc kubenswrapper[4742]: I0317 11:42:17.105992 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-k2nnj"] Mar 17 11:42:17 crc kubenswrapper[4742]: I0317 11:42:17.121645 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-57b1-account-create-update-rgxfk"] Mar 17 11:42:17 crc kubenswrapper[4742]: I0317 11:42:17.132517 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f965-account-create-update-mfv8t"] Mar 17 11:42:18 crc kubenswrapper[4742]: I0317 11:42:18.680028 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="201fe27b-47b0-4b1d-89d5-fc37b565a76a" path="/var/lib/kubelet/pods/201fe27b-47b0-4b1d-89d5-fc37b565a76a/volumes" Mar 17 11:42:18 crc kubenswrapper[4742]: I0317 11:42:18.680639 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949c94b8-282b-40f0-bba5-3865562af774" path="/var/lib/kubelet/pods/949c94b8-282b-40f0-bba5-3865562af774/volumes" Mar 17 11:42:18 crc kubenswrapper[4742]: I0317 11:42:18.681183 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11a1d54-dde6-466c-a500-72fd1c349db3" path="/var/lib/kubelet/pods/d11a1d54-dde6-466c-a500-72fd1c349db3/volumes" Mar 17 11:42:18 crc kubenswrapper[4742]: 
I0317 11:42:18.681687 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5" path="/var/lib/kubelet/pods/f22dbf32-2c55-42e2-a7f1-3fc6c7c6edd5/volumes" Mar 17 11:42:20 crc kubenswrapper[4742]: I0317 11:42:20.663154 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:42:20 crc kubenswrapper[4742]: E0317 11:42:20.664217 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:42:22 crc kubenswrapper[4742]: I0317 11:42:22.038420 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-9srm9"] Mar 17 11:42:22 crc kubenswrapper[4742]: I0317 11:42:22.045310 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-9srm9"] Mar 17 11:42:22 crc kubenswrapper[4742]: I0317 11:42:22.678034 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="204cceda-eecf-48b2-b808-d2981ea6f0be" path="/var/lib/kubelet/pods/204cceda-eecf-48b2-b808-d2981ea6f0be/volumes" Mar 17 11:42:35 crc kubenswrapper[4742]: I0317 11:42:35.663400 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:42:35 crc kubenswrapper[4742]: E0317 11:42:35.664863 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:42:46 crc kubenswrapper[4742]: I0317 11:42:46.665552 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:42:46 crc kubenswrapper[4742]: E0317 11:42:46.666554 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:42:50 crc kubenswrapper[4742]: I0317 11:42:50.064604 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-7kmxq"] Mar 17 11:42:50 crc kubenswrapper[4742]: I0317 11:42:50.074174 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-7kmxq"] Mar 17 11:42:50 crc kubenswrapper[4742]: I0317 11:42:50.675639 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="603a0e75-694a-4ab5-bbe0-616f617bc949" path="/var/lib/kubelet/pods/603a0e75-694a-4ab5-bbe0-616f617bc949/volumes" Mar 17 11:42:53 crc kubenswrapper[4742]: I0317 11:42:53.698776 4742 scope.go:117] "RemoveContainer" containerID="20127a94628f13a2546341aa2e04fdef5eb4c56906bdb046295e675159e13cf0" Mar 17 11:42:53 crc 
kubenswrapper[4742]: I0317 11:42:53.764963 4742 scope.go:117] "RemoveContainer" containerID="42c04c90a6c9e5fa259d3167072073f029de3a0b987f1ee39c7d23b0648f2a34" Mar 17 11:42:53 crc kubenswrapper[4742]: I0317 11:42:53.808041 4742 scope.go:117] "RemoveContainer" containerID="4ef659e0fc73686bdc219e2b805c8ba4347bbc4acfb2982923291e175e6e71cc" Mar 17 11:42:53 crc kubenswrapper[4742]: I0317 11:42:53.847342 4742 scope.go:117] "RemoveContainer" containerID="9bd0a349b324af024e2ad70b8b129ae444cc251b8dc8929d55576fc49d4c0adc" Mar 17 11:42:53 crc kubenswrapper[4742]: I0317 11:42:53.886545 4742 scope.go:117] "RemoveContainer" containerID="f94bad1cff14b843d25f11d02ce6595ca41b53fd50bc7ce6eb7e828abbefcb08" Mar 17 11:42:53 crc kubenswrapper[4742]: I0317 11:42:53.926954 4742 scope.go:117] "RemoveContainer" containerID="8a4d5a7aec20b9bffbb6a7e48f746ea28834748dbe582be542f07299629eb0dc" Mar 17 11:42:53 crc kubenswrapper[4742]: I0317 11:42:53.983828 4742 scope.go:117] "RemoveContainer" containerID="5dbd54130eda4b981d57f4e9a4456c069dd650f871865dfc3219983e4cc0313b" Mar 17 11:42:54 crc kubenswrapper[4742]: I0317 11:42:54.022345 4742 scope.go:117] "RemoveContainer" containerID="2ba37516202552ef925c69fed60b69b58405874b3021fb7d38fb547f33098c55" Mar 17 11:42:54 crc kubenswrapper[4742]: I0317 11:42:54.058158 4742 scope.go:117] "RemoveContainer" containerID="5617aca000aa7b5e6311644bc6b1a093cfe8e1a8ae9ba31c7befacd5b573b699" Mar 17 11:42:54 crc kubenswrapper[4742]: I0317 11:42:54.087745 4742 scope.go:117] "RemoveContainer" containerID="27c73a6430d4129faf969c37f4087e8e8e6e9df6cb6e5b19aa3c6eef621646ff" Mar 17 11:42:54 crc kubenswrapper[4742]: I0317 11:42:54.111878 4742 scope.go:117] "RemoveContainer" containerID="5c6bfb504b711fa3c0c16f626be67dc1479ce3838de3a9caaae00e79da969e41" Mar 17 11:42:54 crc kubenswrapper[4742]: I0317 11:42:54.137003 4742 scope.go:117] "RemoveContainer" containerID="a564811358144cc61faf392366df2fe8f396b29e2f40ffb41049b88d1665bb2e" Mar 17 11:42:55 crc kubenswrapper[4742]: I0317 11:42:55.032471 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-wjx6c"] Mar 17 11:42:55 crc kubenswrapper[4742]: I0317 11:42:55.043769 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-wjx6c"] Mar 17 11:42:56 crc kubenswrapper[4742]: I0317 11:42:56.037420 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-7mmzn"] Mar 17 11:42:56 crc kubenswrapper[4742]: I0317 11:42:56.053491 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-7mmzn"] Mar 17 11:42:56 crc kubenswrapper[4742]: I0317 11:42:56.679171 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23024865-2dad-4a51-af8b-7d7a224c8ce8" path="/var/lib/kubelet/pods/23024865-2dad-4a51-af8b-7d7a224c8ce8/volumes" Mar 17 11:42:56 crc kubenswrapper[4742]: I0317 11:42:56.681567 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3261b59-fc08-4596-bda8-7b398ef979e4" path="/var/lib/kubelet/pods/e3261b59-fc08-4596-bda8-7b398ef979e4/volumes" Mar 17 11:42:59 crc kubenswrapper[4742]: I0317 11:42:59.662740 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:42:59 crc kubenswrapper[4742]: E0317 11:42:59.663379 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:43:00 crc kubenswrapper[4742]: I0317 11:43:00.042491 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qf8ng"] Mar 17 11:43:00 crc kubenswrapper[4742]: I0317 11:43:00.051795 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qf8ng"] Mar 17 11:43:00 crc kubenswrapper[4742]: I0317 11:43:00.678068 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a81353e8-6a78-46f3-ae59-028afb88c5ef" path="/var/lib/kubelet/pods/a81353e8-6a78-46f3-ae59-028afb88c5ef/volumes" Mar 17 11:43:12 crc kubenswrapper[4742]: I0317 11:43:12.034607 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-cs4pt"] Mar 17 11:43:12 crc kubenswrapper[4742]: I0317 11:43:12.042851 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-cs4pt"] Mar 17 11:43:12 crc kubenswrapper[4742]: I0317 11:43:12.688120 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90b52e42-6eca-4585-95a0-057055089c97" path="/var/lib/kubelet/pods/90b52e42-6eca-4585-95a0-057055089c97/volumes" Mar 17 11:43:14 crc kubenswrapper[4742]: I0317 11:43:14.042923 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-qzc74"] Mar 17 11:43:14 crc kubenswrapper[4742]: I0317 11:43:14.057054 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-qzc74"] Mar 17 11:43:14 crc kubenswrapper[4742]: I0317 11:43:14.663703 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:43:14 crc kubenswrapper[4742]: E0317 11:43:14.664614 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:43:14 crc kubenswrapper[4742]: I0317 11:43:14.682587 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c75af6d-6842-49b5-aebe-54feb0644942" path="/var/lib/kubelet/pods/5c75af6d-6842-49b5-aebe-54feb0644942/volumes" Mar 17 11:43:19 crc kubenswrapper[4742]: I0317 11:43:19.438235 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f2tmr"] Mar 17 11:43:19 crc kubenswrapper[4742]: E0317 11:43:19.439379 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8bc9906-5271-4a69-8fa3-e5106f062ac2" containerName="oc" Mar 17 11:43:19 crc kubenswrapper[4742]: I0317 11:43:19.439402 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8bc9906-5271-4a69-8fa3-e5106f062ac2" containerName="oc" Mar 17 11:43:19 crc kubenswrapper[4742]: I0317 11:43:19.439888 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8bc9906-5271-4a69-8fa3-e5106f062ac2" containerName="oc" Mar 17 11:43:19 crc kubenswrapper[4742]: I0317 11:43:19.442222 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f2tmr" Mar 17 11:43:19 crc kubenswrapper[4742]: I0317 11:43:19.451963 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f2tmr"] Mar 17 11:43:19 crc kubenswrapper[4742]: I0317 11:43:19.556159 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-catalog-content\") pod \"community-operators-f2tmr\" (UID: \"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa\") " pod="openshift-marketplace/community-operators-f2tmr" Mar 17 11:43:19 crc kubenswrapper[4742]: I0317 11:43:19.556247 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-utilities\") pod \"community-operators-f2tmr\" (UID: \"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa\") " pod="openshift-marketplace/community-operators-f2tmr" Mar 17 11:43:19 crc kubenswrapper[4742]: I0317 11:43:19.556270 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fjm7\" (UniqueName: \"kubernetes.io/projected/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-kube-api-access-2fjm7\") pod \"community-operators-f2tmr\" (UID: \"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa\") " pod="openshift-marketplace/community-operators-f2tmr" Mar 17 11:43:19 crc kubenswrapper[4742]: I0317 11:43:19.658061 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-utilities\") pod \"community-operators-f2tmr\" (UID: \"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa\") " pod="openshift-marketplace/community-operators-f2tmr" Mar 17 11:43:19 crc kubenswrapper[4742]: I0317 11:43:19.658107 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fjm7\" (UniqueName: \"kubernetes.io/projected/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-kube-api-access-2fjm7\") pod \"community-operators-f2tmr\" (UID: \"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa\") " pod="openshift-marketplace/community-operators-f2tmr" Mar 17 11:43:19 crc kubenswrapper[4742]: I0317 11:43:19.658206 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-catalog-content\") pod \"community-operators-f2tmr\" (UID: \"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa\") " pod="openshift-marketplace/community-operators-f2tmr" Mar 17 11:43:19 crc kubenswrapper[4742]: I0317 11:43:19.658623 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-catalog-content\") pod \"community-operators-f2tmr\" (UID: \"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa\") " pod="openshift-marketplace/community-operators-f2tmr" Mar 17 11:43:19 crc kubenswrapper[4742]: I0317 11:43:19.658834 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-utilities\") pod \"community-operators-f2tmr\" (UID: \"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa\") " pod="openshift-marketplace/community-operators-f2tmr" Mar 17 11:43:19 crc kubenswrapper[4742]: I0317 11:43:19.684217 4742 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2fjm7\" (UniqueName: \"kubernetes.io/projected/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-kube-api-access-2fjm7\") pod \"community-operators-f2tmr\" (UID: \"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa\") " pod="openshift-marketplace/community-operators-f2tmr" Mar 17 11:43:19 crc kubenswrapper[4742]: I0317 11:43:19.780227 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f2tmr" Mar 17 11:43:20 crc kubenswrapper[4742]: I0317 11:43:20.354899 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f2tmr"] Mar 17 11:43:21 crc kubenswrapper[4742]: I0317 11:43:21.036741 4742 generic.go:334] "Generic (PLEG): container finished" podID="3aae4d83-a6a4-440f-b772-a5cb34a9f1fa" containerID="5f19b5d1c55fd11d2c47da31b4e72dbd67065d95f8328d6cd661c226d8841820" exitCode=0 Mar 17 11:43:21 crc kubenswrapper[4742]: I0317 11:43:21.037078 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2tmr" event={"ID":"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa","Type":"ContainerDied","Data":"5f19b5d1c55fd11d2c47da31b4e72dbd67065d95f8328d6cd661c226d8841820"} Mar 17 11:43:21 crc kubenswrapper[4742]: I0317 11:43:21.037112 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2tmr" event={"ID":"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa","Type":"ContainerStarted","Data":"de81e38d8f33b54616e137f8ba6e9db5c1b64a47fcf10d6c19b779cb6872df1a"} Mar 17 11:43:21 crc kubenswrapper[4742]: I0317 11:43:21.039045 4742 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 11:43:22 crc kubenswrapper[4742]: I0317 11:43:22.046894 4742 generic.go:334] "Generic (PLEG): container finished" podID="bd4b8d37-8f12-4560-b616-cbbed45a7cb2" containerID="1381465963db8a53d4dcf2faa1e2ecb764ba8fceac0a265436d632f753dc62de" exitCode=0 Mar 17 11:43:22 crc kubenswrapper[4742]: I0317 11:43:22.046962 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" event={"ID":"bd4b8d37-8f12-4560-b616-cbbed45a7cb2","Type":"ContainerDied","Data":"1381465963db8a53d4dcf2faa1e2ecb764ba8fceac0a265436d632f753dc62de"} Mar 17 11:43:23 crc kubenswrapper[4742]: I0317 11:43:23.433178 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" Mar 17 11:43:23 crc kubenswrapper[4742]: I0317 11:43:23.448087 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-inventory\") pod \"bd4b8d37-8f12-4560-b616-cbbed45a7cb2\" (UID: \"bd4b8d37-8f12-4560-b616-cbbed45a7cb2\") " Mar 17 11:43:23 crc kubenswrapper[4742]: I0317 11:43:23.448214 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lshlh\" (UniqueName: \"kubernetes.io/projected/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-kube-api-access-lshlh\") pod \"bd4b8d37-8f12-4560-b616-cbbed45a7cb2\" (UID: \"bd4b8d37-8f12-4560-b616-cbbed45a7cb2\") " Mar 17 11:43:23 crc kubenswrapper[4742]: I0317 11:43:23.448474 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-ssh-key-openstack-edpm-ipam\") pod \"bd4b8d37-8f12-4560-b616-cbbed45a7cb2\" (UID: \"bd4b8d37-8f12-4560-b616-cbbed45a7cb2\") " Mar 17 11:43:23 crc kubenswrapper[4742]: I0317 11:43:23.458927 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-kube-api-access-lshlh" (OuterVolumeSpecName: "kube-api-access-lshlh") pod "bd4b8d37-8f12-4560-b616-cbbed45a7cb2" (UID: "bd4b8d37-8f12-4560-b616-cbbed45a7cb2"). InnerVolumeSpecName "kube-api-access-lshlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:43:23 crc kubenswrapper[4742]: I0317 11:43:23.484796 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bd4b8d37-8f12-4560-b616-cbbed45a7cb2" (UID: "bd4b8d37-8f12-4560-b616-cbbed45a7cb2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:43:23 crc kubenswrapper[4742]: I0317 11:43:23.488786 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-inventory" (OuterVolumeSpecName: "inventory") pod "bd4b8d37-8f12-4560-b616-cbbed45a7cb2" (UID: "bd4b8d37-8f12-4560-b616-cbbed45a7cb2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:43:23 crc kubenswrapper[4742]: I0317 11:43:23.557721 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 11:43:23 crc kubenswrapper[4742]: I0317 11:43:23.558019 4742 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 11:43:23 crc kubenswrapper[4742]: I0317 11:43:23.558030 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lshlh\" (UniqueName: \"kubernetes.io/projected/bd4b8d37-8f12-4560-b616-cbbed45a7cb2-kube-api-access-lshlh\") on node \"crc\" DevicePath \"\"" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.077697 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" event={"ID":"bd4b8d37-8f12-4560-b616-cbbed45a7cb2","Type":"ContainerDied","Data":"bb4455f813c27aa57a18df3febb56ca61771a5d5130617a0d2ae2f73f29efee5"} Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.077744 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb4455f813c27aa57a18df3febb56ca61771a5d5130617a0d2ae2f73f29efee5" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.077806 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.184459 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k"] Mar 17 11:43:24 crc kubenswrapper[4742]: E0317 11:43:24.184959 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4b8d37-8f12-4560-b616-cbbed45a7cb2" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.184982 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4b8d37-8f12-4560-b616-cbbed45a7cb2" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.185212 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd4b8d37-8f12-4560-b616-cbbed45a7cb2" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.185950 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.188593 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.189831 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.190134 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8b7p" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.190307 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.192474 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k"] Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.280363 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe59da59-475f-4c7d-ab34-f3085125c224-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-th85k\" (UID: \"fe59da59-475f-4c7d-ab34-f3085125c224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.280570 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe59da59-475f-4c7d-ab34-f3085125c224-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-th85k\" (UID: \"fe59da59-475f-4c7d-ab34-f3085125c224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.280626 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k77kw\" (UniqueName: \"kubernetes.io/projected/fe59da59-475f-4c7d-ab34-f3085125c224-kube-api-access-k77kw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-th85k\" (UID: \"fe59da59-475f-4c7d-ab34-f3085125c224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.381841 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe59da59-475f-4c7d-ab34-f3085125c224-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-th85k\" (UID: \"fe59da59-475f-4c7d-ab34-f3085125c224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.381895 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k77kw\" (UniqueName: \"kubernetes.io/projected/fe59da59-475f-4c7d-ab34-f3085125c224-kube-api-access-k77kw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-th85k\" (UID: \"fe59da59-475f-4c7d-ab34-f3085125c224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.381965 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/fe59da59-475f-4c7d-ab34-f3085125c224-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-th85k\" (UID: \"fe59da59-475f-4c7d-ab34-f3085125c224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.386507 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe59da59-475f-4c7d-ab34-f3085125c224-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-th85k\" (UID: \"fe59da59-475f-4c7d-ab34-f3085125c224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.400298 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe59da59-475f-4c7d-ab34-f3085125c224-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-th85k\" (UID: \"fe59da59-475f-4c7d-ab34-f3085125c224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.403763 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k77kw\" (UniqueName: \"kubernetes.io/projected/fe59da59-475f-4c7d-ab34-f3085125c224-kube-api-access-k77kw\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-th85k\" (UID: \"fe59da59-475f-4c7d-ab34-f3085125c224\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" Mar 17 11:43:24 crc kubenswrapper[4742]: I0317 11:43:24.512581 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" Mar 17 11:43:26 crc kubenswrapper[4742]: I0317 11:43:26.098704 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k"] Mar 17 11:43:26 crc kubenswrapper[4742]: W0317 11:43:26.102295 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe59da59_475f_4c7d_ab34_f3085125c224.slice/crio-02519517f93edcb77bd0cb1ec527a7f4a894c4a683a227cdbd37909601bcdf3b WatchSource:0}: Error finding container 02519517f93edcb77bd0cb1ec527a7f4a894c4a683a227cdbd37909601bcdf3b: Status 404 returned error can't find the container with id 02519517f93edcb77bd0cb1ec527a7f4a894c4a683a227cdbd37909601bcdf3b Mar 17 11:43:26 crc kubenswrapper[4742]: I0317 11:43:26.102581 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2tmr" event={"ID":"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa","Type":"ContainerStarted","Data":"4d969a739e06c194141e37ad7215cd875c23d04469edd6a4ddd954975f81fa81"} Mar 17 11:43:26 crc kubenswrapper[4742]: I0317 11:43:26.665488 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:43:27 crc kubenswrapper[4742]: I0317 11:43:27.111210 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" event={"ID":"fe59da59-475f-4c7d-ab34-f3085125c224","Type":"ContainerStarted","Data":"1763fd6ad8f0957a9bdf7b6fdd1999d01aaad344bfe87d330b3365cb8228baf7"} Mar 17 11:43:27 crc kubenswrapper[4742]: I0317 11:43:27.111705 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" event={"ID":"fe59da59-475f-4c7d-ab34-f3085125c224","Type":"ContainerStarted","Data":"02519517f93edcb77bd0cb1ec527a7f4a894c4a683a227cdbd37909601bcdf3b"} Mar 17 11:43:27 crc kubenswrapper[4742]: I0317 11:43:27.116371 4742 generic.go:334] "Generic (PLEG): container finished" podID="3aae4d83-a6a4-440f-b772-a5cb34a9f1fa" containerID="4d969a739e06c194141e37ad7215cd875c23d04469edd6a4ddd954975f81fa81" exitCode=0 Mar 17 11:43:27 crc kubenswrapper[4742]: I0317 11:43:27.116451 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2tmr" event={"ID":"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa","Type":"ContainerDied","Data":"4d969a739e06c194141e37ad7215cd875c23d04469edd6a4ddd954975f81fa81"} Mar 17 11:43:27 crc kubenswrapper[4742]: I0317 11:43:27.129290 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"f3e0af6893b2594265c0b520ca2bca430428f6f884f7c0a9258384a451ab4bae"} Mar 17 11:43:27 crc kubenswrapper[4742]: I0317 11:43:27.138793 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" podStartSLOduration=2.522547596 podStartE2EDuration="3.138771433s" podCreationTimestamp="2026-03-17 11:43:24 +0000 UTC" firstStartedPulling="2026-03-17 11:43:26.104326407 +0000 UTC m=+1909.230454165" lastFinishedPulling="2026-03-17 11:43:26.720550244 +0000 UTC m=+1909.846678002" observedRunningTime="2026-03-17 11:43:27.132093082 +0000 UTC m=+1910.258220890" watchObservedRunningTime="2026-03-17 11:43:27.138771433 +0000 UTC m=+1910.264899191" Mar 17 11:43:28 crc kubenswrapper[4742]: I0317 11:43:28.143712 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2tmr" event={"ID":"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa","Type":"ContainerStarted","Data":"a24bb3423fd6c614d478fa816e35e28e57a04f304ab6bd2257a2a2a21148e2af"} Mar 17 11:43:28 crc kubenswrapper[4742]: I0317 11:43:28.165854 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f2tmr" podStartSLOduration=2.475659819 podStartE2EDuration="9.165837711s" podCreationTimestamp="2026-03-17 11:43:19 +0000 UTC" firstStartedPulling="2026-03-17 11:43:21.038763582 +0000 UTC m=+1904.164891340" lastFinishedPulling="2026-03-17 11:43:27.728941474 +0000 UTC m=+1910.855069232" observedRunningTime="2026-03-17 11:43:28.162778813 +0000 UTC m=+1911.288906601" watchObservedRunningTime="2026-03-17 11:43:28.165837711 +0000 UTC m=+1911.291965469" Mar 17 11:43:30 crc kubenswrapper[4742]: I0317 11:43:30.098988 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f2tmr" Mar 17 11:43:30 crc kubenswrapper[4742]: I0317 11:43:30.099179 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f2tmr" Mar 17 11:43:31 crc kubenswrapper[4742]: I0317 11:43:31.149844 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-f2tmr" podUID="3aae4d83-a6a4-440f-b772-a5cb34a9f1fa" containerName="registry-server" probeResult="failure" output=< Mar 17 11:43:31 crc kubenswrapper[4742]: timeout: failed to connect service ":50051" within 1s Mar 17 11:43:31 crc kubenswrapper[4742]: > 
Mar 17 11:43:32 crc kubenswrapper[4742]: I0317 11:43:32.703458 4742 generic.go:334] "Generic (PLEG): container finished" podID="fe59da59-475f-4c7d-ab34-f3085125c224" containerID="1763fd6ad8f0957a9bdf7b6fdd1999d01aaad344bfe87d330b3365cb8228baf7" exitCode=0 Mar 17 11:43:32 crc kubenswrapper[4742]: I0317 11:43:32.703630 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" event={"ID":"fe59da59-475f-4c7d-ab34-f3085125c224","Type":"ContainerDied","Data":"1763fd6ad8f0957a9bdf7b6fdd1999d01aaad344bfe87d330b3365cb8228baf7"} Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.069204 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.143386 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe59da59-475f-4c7d-ab34-f3085125c224-inventory\") pod \"fe59da59-475f-4c7d-ab34-f3085125c224\" (UID: \"fe59da59-475f-4c7d-ab34-f3085125c224\") " Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.143759 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe59da59-475f-4c7d-ab34-f3085125c224-ssh-key-openstack-edpm-ipam\") pod \"fe59da59-475f-4c7d-ab34-f3085125c224\" (UID: \"fe59da59-475f-4c7d-ab34-f3085125c224\") " Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.143874 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k77kw\" (UniqueName: \"kubernetes.io/projected/fe59da59-475f-4c7d-ab34-f3085125c224-kube-api-access-k77kw\") pod \"fe59da59-475f-4c7d-ab34-f3085125c224\" (UID: \"fe59da59-475f-4c7d-ab34-f3085125c224\") " Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.148891 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe59da59-475f-4c7d-ab34-f3085125c224-kube-api-access-k77kw" (OuterVolumeSpecName: "kube-api-access-k77kw") pod "fe59da59-475f-4c7d-ab34-f3085125c224" (UID: "fe59da59-475f-4c7d-ab34-f3085125c224"). InnerVolumeSpecName "kube-api-access-k77kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.170518 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe59da59-475f-4c7d-ab34-f3085125c224-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fe59da59-475f-4c7d-ab34-f3085125c224" (UID: "fe59da59-475f-4c7d-ab34-f3085125c224"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.171268 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe59da59-475f-4c7d-ab34-f3085125c224-inventory" (OuterVolumeSpecName: "inventory") pod "fe59da59-475f-4c7d-ab34-f3085125c224" (UID: "fe59da59-475f-4c7d-ab34-f3085125c224"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.246058 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe59da59-475f-4c7d-ab34-f3085125c224-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.246101 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k77kw\" (UniqueName: \"kubernetes.io/projected/fe59da59-475f-4c7d-ab34-f3085125c224-kube-api-access-k77kw\") on node \"crc\" DevicePath \"\"" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.246113 4742 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe59da59-475f-4c7d-ab34-f3085125c224-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.720285 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" event={"ID":"fe59da59-475f-4c7d-ab34-f3085125c224","Type":"ContainerDied","Data":"02519517f93edcb77bd0cb1ec527a7f4a894c4a683a227cdbd37909601bcdf3b"} Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.720341 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-th85k" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.720361 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02519517f93edcb77bd0cb1ec527a7f4a894c4a683a227cdbd37909601bcdf3b" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.847731 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5"] Mar 17 11:43:34 crc kubenswrapper[4742]: E0317 11:43:34.848561 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe59da59-475f-4c7d-ab34-f3085125c224" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.848588 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe59da59-475f-4c7d-ab34-f3085125c224" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.849144 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe59da59-475f-4c7d-ab34-f3085125c224" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.850504 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.855510 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.855642 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.855769 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.856693 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8b7p" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.865478 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5"] Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.961306 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71aa9411-3abc-46dd-9907-3f2847f83866-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lk8g5\" (UID: \"71aa9411-3abc-46dd-9907-3f2847f83866\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.961421 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71aa9411-3abc-46dd-9907-3f2847f83866-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lk8g5\" (UID: \"71aa9411-3abc-46dd-9907-3f2847f83866\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5" Mar 17 11:43:34 crc kubenswrapper[4742]: I0317 11:43:34.961467 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxmz4\" (UniqueName: \"kubernetes.io/projected/71aa9411-3abc-46dd-9907-3f2847f83866-kube-api-access-bxmz4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lk8g5\" (UID: \"71aa9411-3abc-46dd-9907-3f2847f83866\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5" Mar 17 11:43:35 crc kubenswrapper[4742]: I0317 11:43:35.062946 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxmz4\" (UniqueName: \"kubernetes.io/projected/71aa9411-3abc-46dd-9907-3f2847f83866-kube-api-access-bxmz4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lk8g5\" (UID: \"71aa9411-3abc-46dd-9907-3f2847f83866\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5" Mar 17 11:43:35 crc kubenswrapper[4742]: I0317 11:43:35.063376 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71aa9411-3abc-46dd-9907-3f2847f83866-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lk8g5\" (UID: \"71aa9411-3abc-46dd-9907-3f2847f83866\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5" Mar 17 11:43:35 crc kubenswrapper[4742]: I0317 11:43:35.063445 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71aa9411-3abc-46dd-9907-3f2847f83866-ssh-key-openstack-edpm-ipam\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-lk8g5\" (UID: \"71aa9411-3abc-46dd-9907-3f2847f83866\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5" Mar 17 11:43:35 crc kubenswrapper[4742]: I0317 11:43:35.068609 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71aa9411-3abc-46dd-9907-3f2847f83866-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lk8g5\" (UID: \"71aa9411-3abc-46dd-9907-3f2847f83866\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5" Mar 17 11:43:35 crc kubenswrapper[4742]: I0317 11:43:35.069512 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71aa9411-3abc-46dd-9907-3f2847f83866-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lk8g5\" (UID: \"71aa9411-3abc-46dd-9907-3f2847f83866\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5" Mar 17 11:43:35 crc kubenswrapper[4742]: I0317 11:43:35.090386 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxmz4\" (UniqueName: \"kubernetes.io/projected/71aa9411-3abc-46dd-9907-3f2847f83866-kube-api-access-bxmz4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lk8g5\" (UID: \"71aa9411-3abc-46dd-9907-3f2847f83866\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5" Mar 17 11:43:35 crc kubenswrapper[4742]: I0317 11:43:35.184785 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5" Mar 17 11:43:35 crc kubenswrapper[4742]: W0317 11:43:35.657946 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71aa9411_3abc_46dd_9907_3f2847f83866.slice/crio-582c689f17f10e68460d303fcc8d7ebefaf473d26288c19378141fa1d3205276 WatchSource:0}: Error finding container 582c689f17f10e68460d303fcc8d7ebefaf473d26288c19378141fa1d3205276: Status 404 returned error can't find the container with id 582c689f17f10e68460d303fcc8d7ebefaf473d26288c19378141fa1d3205276 Mar 17 11:43:35 crc kubenswrapper[4742]: I0317 11:43:35.659390 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5"] Mar 17 11:43:35 crc kubenswrapper[4742]: I0317 11:43:35.729160 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5" event={"ID":"71aa9411-3abc-46dd-9907-3f2847f83866","Type":"ContainerStarted","Data":"582c689f17f10e68460d303fcc8d7ebefaf473d26288c19378141fa1d3205276"} Mar 17 11:43:36 crc kubenswrapper[4742]: I0317 11:43:36.739631 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5" event={"ID":"71aa9411-3abc-46dd-9907-3f2847f83866","Type":"ContainerStarted","Data":"ae8d593093b587e1b8da82bb2924d387b2a4bafd6cc48193aab73acb4f1e5450"} Mar 17 11:43:36 crc kubenswrapper[4742]: I0317 11:43:36.764613 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5" podStartSLOduration=1.984776697 podStartE2EDuration="2.764595761s" podCreationTimestamp="2026-03-17 11:43:34 +0000 UTC" firstStartedPulling="2026-03-17 11:43:35.66025308 +0000 UTC m=+1918.786380838" lastFinishedPulling="2026-03-17 
11:43:36.440072144 +0000 UTC m=+1919.566199902" observedRunningTime="2026-03-17 11:43:36.757827488 +0000 UTC m=+1919.883955246" watchObservedRunningTime="2026-03-17 11:43:36.764595761 +0000 UTC m=+1919.890723519" Mar 17 11:43:39 crc kubenswrapper[4742]: I0317 11:43:39.858858 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f2tmr" Mar 17 11:43:39 crc kubenswrapper[4742]: I0317 11:43:39.936925 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f2tmr" Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.038888 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f2tmr"] Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.105685 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vnvhp"] Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.105985 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vnvhp" podUID="5b587550-1bc7-4b07-a980-39f876baec8f" containerName="registry-server" containerID="cri-o://5f1fe5e0096af1d3d10bf091f7b3ab14912740705d0dd65f4cba83e381922946" gracePeriod=2 Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.619587 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vnvhp" Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.683569 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdgd2\" (UniqueName: \"kubernetes.io/projected/5b587550-1bc7-4b07-a980-39f876baec8f-kube-api-access-qdgd2\") pod \"5b587550-1bc7-4b07-a980-39f876baec8f\" (UID: \"5b587550-1bc7-4b07-a980-39f876baec8f\") " Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.683683 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b587550-1bc7-4b07-a980-39f876baec8f-catalog-content\") pod \"5b587550-1bc7-4b07-a980-39f876baec8f\" (UID: \"5b587550-1bc7-4b07-a980-39f876baec8f\") " Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.683796 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b587550-1bc7-4b07-a980-39f876baec8f-utilities\") pod \"5b587550-1bc7-4b07-a980-39f876baec8f\" (UID: \"5b587550-1bc7-4b07-a980-39f876baec8f\") " Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.684566 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b587550-1bc7-4b07-a980-39f876baec8f-utilities" (OuterVolumeSpecName: "utilities") pod "5b587550-1bc7-4b07-a980-39f876baec8f" (UID: "5b587550-1bc7-4b07-a980-39f876baec8f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.727074 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b587550-1bc7-4b07-a980-39f876baec8f-kube-api-access-qdgd2" (OuterVolumeSpecName: "kube-api-access-qdgd2") pod "5b587550-1bc7-4b07-a980-39f876baec8f" (UID: "5b587550-1bc7-4b07-a980-39f876baec8f"). InnerVolumeSpecName "kube-api-access-qdgd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.778990 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b587550-1bc7-4b07-a980-39f876baec8f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b587550-1bc7-4b07-a980-39f876baec8f" (UID: "5b587550-1bc7-4b07-a980-39f876baec8f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.779837 4742 generic.go:334] "Generic (PLEG): container finished" podID="5b587550-1bc7-4b07-a980-39f876baec8f" containerID="5f1fe5e0096af1d3d10bf091f7b3ab14912740705d0dd65f4cba83e381922946" exitCode=0 Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.780012 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnvhp" event={"ID":"5b587550-1bc7-4b07-a980-39f876baec8f","Type":"ContainerDied","Data":"5f1fe5e0096af1d3d10bf091f7b3ab14912740705d0dd65f4cba83e381922946"} Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.780060 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnvhp" event={"ID":"5b587550-1bc7-4b07-a980-39f876baec8f","Type":"ContainerDied","Data":"3c2b8dfeff7941f3bebbd462a92dfb224c305517810c94052373e17980f8e91e"} Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.780082 4742 scope.go:117] "RemoveContainer" containerID="5f1fe5e0096af1d3d10bf091f7b3ab14912740705d0dd65f4cba83e381922946" Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.780144 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vnvhp" Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.785533 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdgd2\" (UniqueName: \"kubernetes.io/projected/5b587550-1bc7-4b07-a980-39f876baec8f-kube-api-access-qdgd2\") on node \"crc\" DevicePath \"\"" Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.786246 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b587550-1bc7-4b07-a980-39f876baec8f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.786266 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b587550-1bc7-4b07-a980-39f876baec8f-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.809568 4742 scope.go:117] "RemoveContainer" containerID="f2754896cf30f6ae224faf876b8ee131ba6ecb139c1f9bc3967fc8b1744f879e" Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.813844 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vnvhp"] Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.822764 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vnvhp"] Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.829848 4742 scope.go:117] "RemoveContainer" containerID="501807a2e7f165e013e1b9af07f52f04b2cfef3778b399df487ba77ddb0cbeea" Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.865045 4742 scope.go:117] "RemoveContainer" containerID="5f1fe5e0096af1d3d10bf091f7b3ab14912740705d0dd65f4cba83e381922946" Mar 17 11:43:40 crc kubenswrapper[4742]: E0317 11:43:40.865685 4742 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f1fe5e0096af1d3d10bf091f7b3ab14912740705d0dd65f4cba83e381922946\": container with ID starting with 5f1fe5e0096af1d3d10bf091f7b3ab14912740705d0dd65f4cba83e381922946 not found: ID does not exist" containerID="5f1fe5e0096af1d3d10bf091f7b3ab14912740705d0dd65f4cba83e381922946" Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.865788 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f1fe5e0096af1d3d10bf091f7b3ab14912740705d0dd65f4cba83e381922946"} err="failed to get container status \"5f1fe5e0096af1d3d10bf091f7b3ab14912740705d0dd65f4cba83e381922946\": rpc error: code = NotFound desc = could not find container \"5f1fe5e0096af1d3d10bf091f7b3ab14912740705d0dd65f4cba83e381922946\": container with ID starting with 5f1fe5e0096af1d3d10bf091f7b3ab14912740705d0dd65f4cba83e381922946 not found: ID does not exist" Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.865819 4742 scope.go:117] "RemoveContainer" containerID="f2754896cf30f6ae224faf876b8ee131ba6ecb139c1f9bc3967fc8b1744f879e" Mar 17 11:43:40 crc kubenswrapper[4742]: E0317 11:43:40.866433 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2754896cf30f6ae224faf876b8ee131ba6ecb139c1f9bc3967fc8b1744f879e\": container with ID starting with f2754896cf30f6ae224faf876b8ee131ba6ecb139c1f9bc3967fc8b1744f879e not found: ID does not exist" containerID="f2754896cf30f6ae224faf876b8ee131ba6ecb139c1f9bc3967fc8b1744f879e" Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.866458 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2754896cf30f6ae224faf876b8ee131ba6ecb139c1f9bc3967fc8b1744f879e"} err="failed to get container status \"f2754896cf30f6ae224faf876b8ee131ba6ecb139c1f9bc3967fc8b1744f879e\": rpc error: code = NotFound desc = could not find container \"f2754896cf30f6ae224faf876b8ee131ba6ecb139c1f9bc3967fc8b1744f879e\": container with ID starting with f2754896cf30f6ae224faf876b8ee131ba6ecb139c1f9bc3967fc8b1744f879e not found: ID does not exist" Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.866472 4742 scope.go:117] "RemoveContainer" containerID="501807a2e7f165e013e1b9af07f52f04b2cfef3778b399df487ba77ddb0cbeea" Mar 17 11:43:40 crc kubenswrapper[4742]: E0317 11:43:40.866766 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"501807a2e7f165e013e1b9af07f52f04b2cfef3778b399df487ba77ddb0cbeea\": container with ID starting with 501807a2e7f165e013e1b9af07f52f04b2cfef3778b399df487ba77ddb0cbeea not found: ID does not exist" containerID="501807a2e7f165e013e1b9af07f52f04b2cfef3778b399df487ba77ddb0cbeea" Mar 17 11:43:40 crc kubenswrapper[4742]: I0317 11:43:40.866787 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"501807a2e7f165e013e1b9af07f52f04b2cfef3778b399df487ba77ddb0cbeea"} err="failed to get container status \"501807a2e7f165e013e1b9af07f52f04b2cfef3778b399df487ba77ddb0cbeea\": rpc error: code = NotFound desc = could not find container \"501807a2e7f165e013e1b9af07f52f04b2cfef3778b399df487ba77ddb0cbeea\": container with ID starting with 501807a2e7f165e013e1b9af07f52f04b2cfef3778b399df487ba77ddb0cbeea not found: ID does not exist" Mar 17 11:43:42 crc kubenswrapper[4742]: I0317 11:43:42.673809 4742 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="5b587550-1bc7-4b07-a980-39f876baec8f" path="/var/lib/kubelet/pods/5b587550-1bc7-4b07-a980-39f876baec8f/volumes" Mar 17 11:43:43 crc kubenswrapper[4742]: I0317 11:43:43.066161 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-nx2qq"] Mar 17 11:43:43 crc kubenswrapper[4742]: I0317 11:43:43.086836 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-nx2qq"] Mar 17 11:43:43 crc kubenswrapper[4742]: I0317 11:43:43.098643 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-a76f-account-create-update-9nrf6"] Mar 17 11:43:43 crc kubenswrapper[4742]: I0317 11:43:43.107893 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-lsskf"] Mar 17 11:43:43 crc kubenswrapper[4742]: I0317 11:43:43.114889 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-a76f-account-create-update-9nrf6"] Mar 17 11:43:43 crc kubenswrapper[4742]: I0317 11:43:43.121556 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-vcls7"] Mar 17 11:43:43 crc kubenswrapper[4742]: I0317 11:43:43.128032 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-66c1-account-create-update-98hrl"] Mar 17 11:43:43 crc kubenswrapper[4742]: I0317 11:43:43.134537 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-9a1d-account-create-update-fk254"] Mar 17 11:43:43 crc kubenswrapper[4742]: I0317 11:43:43.140591 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-66c1-account-create-update-98hrl"] Mar 17 11:43:43 crc kubenswrapper[4742]: I0317 11:43:43.146502 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-lsskf"] Mar 17 11:43:43 crc kubenswrapper[4742]: I0317 11:43:43.153463 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-vcls7"] Mar 17 11:43:43 crc kubenswrapper[4742]: I0317 11:43:43.159349 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-9a1d-account-create-update-fk254"] Mar 17 11:43:44 crc kubenswrapper[4742]: I0317 11:43:44.683208 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="191435c1-53cc-4df8-97de-1c71c78d9595" path="/var/lib/kubelet/pods/191435c1-53cc-4df8-97de-1c71c78d9595/volumes" Mar 17 11:43:44 crc kubenswrapper[4742]: I0317 11:43:44.685093 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fce8ab7-2019-4d76-a2b5-003b5489bd87" path="/var/lib/kubelet/pods/1fce8ab7-2019-4d76-a2b5-003b5489bd87/volumes" Mar 17 11:43:44 crc kubenswrapper[4742]: I0317 11:43:44.686561 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c138c21-ff11-48af-9745-1e11b6b11467" path="/var/lib/kubelet/pods/2c138c21-ff11-48af-9745-1e11b6b11467/volumes" Mar 17 11:43:44 crc kubenswrapper[4742]: I0317 11:43:44.687840 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="536f4bea-32b3-4fd4-a576-b73a67d7ad23" path="/var/lib/kubelet/pods/536f4bea-32b3-4fd4-a576-b73a67d7ad23/volumes" Mar 17 11:43:44 crc kubenswrapper[4742]: I0317 11:43:44.690417 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be226402-dc63-46e4-a635-670382b29013" path="/var/lib/kubelet/pods/be226402-dc63-46e4-a635-670382b29013/volumes" Mar 17 11:43:44 crc kubenswrapper[4742]: I0317 11:43:44.692133 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="f5e4c0a2-6132-410b-8740-6c9e171c7824" path="/var/lib/kubelet/pods/f5e4c0a2-6132-410b-8740-6c9e171c7824/volumes" Mar 17 11:43:54 crc kubenswrapper[4742]: I0317 11:43:54.394702 4742 scope.go:117] "RemoveContainer" containerID="cb6c5cccbe986bd1e0e45082009a082d04c991d8ad6d2ed9bb320e5975d56e31" Mar 17 11:43:54 crc kubenswrapper[4742]: I0317 11:43:54.448617 4742 scope.go:117] "RemoveContainer" containerID="4ff039c5f059f386c06e8e1cfdbffd1e2a65257ed35a387c1224c90134df023a" Mar 17 11:43:54 crc kubenswrapper[4742]: I0317 11:43:54.502316 4742 scope.go:117] "RemoveContainer" containerID="0aa271424e124473fa5166acc53c483de31ee97380a2e1b5a179eee85102ec11" Mar 17 11:43:54 crc kubenswrapper[4742]: I0317 11:43:54.542427 4742 scope.go:117] "RemoveContainer" containerID="88168536c116dbbdd920a498b31d10d9503492eb277131566703e79519b7835c" Mar 17 11:43:54 crc kubenswrapper[4742]: I0317 11:43:54.572811 4742 scope.go:117] "RemoveContainer" containerID="ed201381f67965df77d8cbea154884269fa05a627a0bd50f69090277b724b3ed" Mar 17 11:43:54 crc kubenswrapper[4742]: I0317 11:43:54.615615 4742 scope.go:117] "RemoveContainer" containerID="2d20c9a98cfb3056390dacb40419181cfa85f282a737dfffb38d5c2e64021e7c" Mar 17 11:43:54 crc kubenswrapper[4742]: I0317 11:43:54.648509 4742 scope.go:117] "RemoveContainer" containerID="726ccc09aac7a7dfce0c397456d3768ad3833d3e4b1325cf1dfe826d69747455" Mar 17 11:43:54 crc kubenswrapper[4742]: I0317 11:43:54.681697 4742 scope.go:117] "RemoveContainer" containerID="7c6c0d952f237a67f8d73cd492a5849ea114b6e55c0029547f2fcaedaa3a4ab4" Mar 17 11:43:54 crc kubenswrapper[4742]: I0317 11:43:54.708588 4742 scope.go:117] "RemoveContainer" containerID="ac3e76307d9ac2538e6ae31da71f62ad326944c8bae3edefc48f083629472d1c" Mar 17 11:43:54 crc kubenswrapper[4742]: I0317 11:43:54.730604 4742 scope.go:117] "RemoveContainer" containerID="09bfd82acac1b94c1f73f7ca0c85c42677c88cfd225aac247dc1317905cfc092" Mar 17 11:43:54 crc kubenswrapper[4742]: I0317 11:43:54.761202 4742 scope.go:117] "RemoveContainer" containerID="a048e1a32911037a0dbd5f184d943a049e436c92cf0945a43419112df79b53df" Mar 17 11:44:00 crc kubenswrapper[4742]: I0317 11:44:00.148881 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562464-mnq7n"] Mar 17 11:44:00 crc kubenswrapper[4742]: E0317 11:44:00.149712 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b587550-1bc7-4b07-a980-39f876baec8f" containerName="registry-server" Mar 17 11:44:00 crc kubenswrapper[4742]: I0317 11:44:00.149726 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b587550-1bc7-4b07-a980-39f876baec8f" containerName="registry-server" Mar 17 11:44:00 crc kubenswrapper[4742]: E0317 11:44:00.149753 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b587550-1bc7-4b07-a980-39f876baec8f" containerName="extract-content" Mar 17 11:44:00 crc kubenswrapper[4742]: I0317 11:44:00.149761 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b587550-1bc7-4b07-a980-39f876baec8f" containerName="extract-content" Mar 17 11:44:00 crc kubenswrapper[4742]: E0317 11:44:00.149779 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b587550-1bc7-4b07-a980-39f876baec8f" containerName="extract-utilities" Mar 17 11:44:00 crc kubenswrapper[4742]: I0317 11:44:00.149784 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b587550-1bc7-4b07-a980-39f876baec8f" containerName="extract-utilities" Mar 17 11:44:00 crc kubenswrapper[4742]: I0317 11:44:00.149955 4742 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5b587550-1bc7-4b07-a980-39f876baec8f" containerName="registry-server" Mar 17 11:44:00 crc kubenswrapper[4742]: I0317 11:44:00.150512 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562464-mnq7n" Mar 17 11:44:00 crc kubenswrapper[4742]: I0317 11:44:00.153018 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:44:00 crc kubenswrapper[4742]: I0317 11:44:00.153181 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:44:00 crc kubenswrapper[4742]: I0317 11:44:00.154299 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:44:00 crc kubenswrapper[4742]: I0317 11:44:00.173036 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562464-mnq7n"] Mar 17 11:44:00 crc kubenswrapper[4742]: I0317 11:44:00.195951 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtx4b\" (UniqueName: \"kubernetes.io/projected/0236d66a-1c05-4000-928f-449316a872d2-kube-api-access-jtx4b\") pod \"auto-csr-approver-29562464-mnq7n\" (UID: \"0236d66a-1c05-4000-928f-449316a872d2\") " pod="openshift-infra/auto-csr-approver-29562464-mnq7n" Mar 17 11:44:00 crc kubenswrapper[4742]: I0317 11:44:00.297951 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtx4b\" (UniqueName: \"kubernetes.io/projected/0236d66a-1c05-4000-928f-449316a872d2-kube-api-access-jtx4b\") pod \"auto-csr-approver-29562464-mnq7n\" (UID: \"0236d66a-1c05-4000-928f-449316a872d2\") " pod="openshift-infra/auto-csr-approver-29562464-mnq7n" Mar 17 11:44:00 crc kubenswrapper[4742]: I0317 11:44:00.325222 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtx4b\" (UniqueName: \"kubernetes.io/projected/0236d66a-1c05-4000-928f-449316a872d2-kube-api-access-jtx4b\") pod \"auto-csr-approver-29562464-mnq7n\" (UID: \"0236d66a-1c05-4000-928f-449316a872d2\") " pod="openshift-infra/auto-csr-approver-29562464-mnq7n" Mar 17 11:44:00 crc kubenswrapper[4742]: I0317 11:44:00.480956 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562464-mnq7n"
Mar 17 11:44:00 crc kubenswrapper[4742]: I0317 11:44:00.996174 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562464-mnq7n"]
Mar 17 11:44:02 crc kubenswrapper[4742]: I0317 11:44:02.006012 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562464-mnq7n" event={"ID":"0236d66a-1c05-4000-928f-449316a872d2","Type":"ContainerStarted","Data":"1188a48a29d1f8d359b37dc734336fcac59b46cf1b6d912d4f6921ae2c127bf3"}
Mar 17 11:44:03 crc kubenswrapper[4742]: I0317 11:44:03.017853 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562464-mnq7n" event={"ID":"0236d66a-1c05-4000-928f-449316a872d2","Type":"ContainerStarted","Data":"2ae2c8b474ef4c683309b76ef58c23b812c57753185be69f98636e91ab4d4390"}
Mar 17 11:44:03 crc kubenswrapper[4742]: I0317 11:44:03.037337 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562464-mnq7n" podStartSLOduration=1.371156182 podStartE2EDuration="3.037317349s" podCreationTimestamp="2026-03-17 11:44:00 +0000 UTC" firstStartedPulling="2026-03-17 11:44:01.002460937 +0000 UTC m=+1944.128588725" lastFinishedPulling="2026-03-17 11:44:02.668622114 +0000 UTC m=+1945.794749892" observedRunningTime="2026-03-17 11:44:03.032051595 +0000 UTC m=+1946.158179393" watchObservedRunningTime="2026-03-17 11:44:03.037317349 +0000 UTC m=+1946.163445107"
Mar 17 11:44:04 crc kubenswrapper[4742]: I0317 11:44:04.031442 4742 generic.go:334] "Generic (PLEG): container finished" podID="0236d66a-1c05-4000-928f-449316a872d2" containerID="2ae2c8b474ef4c683309b76ef58c23b812c57753185be69f98636e91ab4d4390" exitCode=0
Mar 17 11:44:04 crc kubenswrapper[4742]: I0317 11:44:04.031483 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562464-mnq7n" event={"ID":"0236d66a-1c05-4000-928f-449316a872d2","Type":"ContainerDied","Data":"2ae2c8b474ef4c683309b76ef58c23b812c57753185be69f98636e91ab4d4390"}
Mar 17 11:44:05 crc kubenswrapper[4742]: I0317 11:44:05.438309 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562464-mnq7n"
Mar 17 11:44:05 crc kubenswrapper[4742]: I0317 11:44:05.555051 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtx4b\" (UniqueName: \"kubernetes.io/projected/0236d66a-1c05-4000-928f-449316a872d2-kube-api-access-jtx4b\") pod \"0236d66a-1c05-4000-928f-449316a872d2\" (UID: \"0236d66a-1c05-4000-928f-449316a872d2\") "
Mar 17 11:44:05 crc kubenswrapper[4742]: I0317 11:44:05.560603 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0236d66a-1c05-4000-928f-449316a872d2-kube-api-access-jtx4b" (OuterVolumeSpecName: "kube-api-access-jtx4b") pod "0236d66a-1c05-4000-928f-449316a872d2" (UID: "0236d66a-1c05-4000-928f-449316a872d2"). InnerVolumeSpecName "kube-api-access-jtx4b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:44:05 crc kubenswrapper[4742]: I0317 11:44:05.656622 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtx4b\" (UniqueName: \"kubernetes.io/projected/0236d66a-1c05-4000-928f-449316a872d2-kube-api-access-jtx4b\") on node \"crc\" DevicePath \"\""
Mar 17 11:44:06 crc kubenswrapper[4742]: I0317 11:44:06.063188 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562464-mnq7n" event={"ID":"0236d66a-1c05-4000-928f-449316a872d2","Type":"ContainerDied","Data":"1188a48a29d1f8d359b37dc734336fcac59b46cf1b6d912d4f6921ae2c127bf3"}
Mar 17 11:44:06 crc kubenswrapper[4742]: I0317 11:44:06.063507 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1188a48a29d1f8d359b37dc734336fcac59b46cf1b6d912d4f6921ae2c127bf3"
Mar 17 11:44:06 crc kubenswrapper[4742]: I0317 11:44:06.063296 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562464-mnq7n"
Mar 17 11:44:06 crc kubenswrapper[4742]: I0317 11:44:06.130947 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562458-dphx7"]
Mar 17 11:44:06 crc kubenswrapper[4742]: I0317 11:44:06.147783 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562458-dphx7"]
Mar 17 11:44:06 crc kubenswrapper[4742]: I0317 11:44:06.678368 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a9fe8c0-2ce7-4fae-b112-a9778e1cee38" path="/var/lib/kubelet/pods/3a9fe8c0-2ce7-4fae-b112-a9778e1cee38/volumes"
Mar 17 11:44:15 crc kubenswrapper[4742]: I0317 11:44:15.158741 4742 generic.go:334] "Generic (PLEG): container finished" podID="71aa9411-3abc-46dd-9907-3f2847f83866" containerID="ae8d593093b587e1b8da82bb2924d387b2a4bafd6cc48193aab73acb4f1e5450" exitCode=0
Mar 17 11:44:15 crc kubenswrapper[4742]: I0317 11:44:15.158858 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5" event={"ID":"71aa9411-3abc-46dd-9907-3f2847f83866","Type":"ContainerDied","Data":"ae8d593093b587e1b8da82bb2924d387b2a4bafd6cc48193aab73acb4f1e5450"}
Mar 17 11:44:16 crc kubenswrapper[4742]: I0317 11:44:16.037466 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bq6vr"]
Mar 17 11:44:16 crc kubenswrapper[4742]: I0317 11:44:16.044443 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-bq6vr"]
Mar 17 11:44:16 crc kubenswrapper[4742]: I0317 11:44:16.710353 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e15fe5ee-73d7-415a-a61c-a0e67d085f3a" path="/var/lib/kubelet/pods/e15fe5ee-73d7-415a-a61c-a0e67d085f3a/volumes"
Mar 17 11:44:16 crc kubenswrapper[4742]: I0317 11:44:16.756611 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5"
Mar 17 11:44:16 crc kubenswrapper[4742]: I0317 11:44:16.832721 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71aa9411-3abc-46dd-9907-3f2847f83866-inventory\") pod \"71aa9411-3abc-46dd-9907-3f2847f83866\" (UID: \"71aa9411-3abc-46dd-9907-3f2847f83866\") "
Mar 17 11:44:16 crc kubenswrapper[4742]: I0317 11:44:16.833026 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxmz4\" (UniqueName: \"kubernetes.io/projected/71aa9411-3abc-46dd-9907-3f2847f83866-kube-api-access-bxmz4\") pod \"71aa9411-3abc-46dd-9907-3f2847f83866\" (UID: \"71aa9411-3abc-46dd-9907-3f2847f83866\") "
Mar 17 11:44:16 crc kubenswrapper[4742]: I0317 11:44:16.833256 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71aa9411-3abc-46dd-9907-3f2847f83866-ssh-key-openstack-edpm-ipam\") pod \"71aa9411-3abc-46dd-9907-3f2847f83866\" (UID: \"71aa9411-3abc-46dd-9907-3f2847f83866\") "
Mar 17 11:44:16 crc kubenswrapper[4742]: I0317 11:44:16.837889 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71aa9411-3abc-46dd-9907-3f2847f83866-kube-api-access-bxmz4" (OuterVolumeSpecName: "kube-api-access-bxmz4") pod "71aa9411-3abc-46dd-9907-3f2847f83866" (UID: "71aa9411-3abc-46dd-9907-3f2847f83866"). InnerVolumeSpecName "kube-api-access-bxmz4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:44:16 crc kubenswrapper[4742]: I0317 11:44:16.857705 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71aa9411-3abc-46dd-9907-3f2847f83866-inventory" (OuterVolumeSpecName: "inventory") pod "71aa9411-3abc-46dd-9907-3f2847f83866" (UID: "71aa9411-3abc-46dd-9907-3f2847f83866"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:44:16 crc kubenswrapper[4742]: I0317 11:44:16.860182 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71aa9411-3abc-46dd-9907-3f2847f83866-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "71aa9411-3abc-46dd-9907-3f2847f83866" (UID: "71aa9411-3abc-46dd-9907-3f2847f83866"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:44:16 crc kubenswrapper[4742]: I0317 11:44:16.934553 4742 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71aa9411-3abc-46dd-9907-3f2847f83866-inventory\") on node \"crc\" DevicePath \"\""
Mar 17 11:44:16 crc kubenswrapper[4742]: I0317 11:44:16.934606 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxmz4\" (UniqueName: \"kubernetes.io/projected/71aa9411-3abc-46dd-9907-3f2847f83866-kube-api-access-bxmz4\") on node \"crc\" DevicePath \"\""
Mar 17 11:44:16 crc kubenswrapper[4742]: I0317 11:44:16.934616 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71aa9411-3abc-46dd-9907-3f2847f83866-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.180941 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5" event={"ID":"71aa9411-3abc-46dd-9907-3f2847f83866","Type":"ContainerDied","Data":"582c689f17f10e68460d303fcc8d7ebefaf473d26288c19378141fa1d3205276"}
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.181370 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="582c689f17f10e68460d303fcc8d7ebefaf473d26288c19378141fa1d3205276"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.181560 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lk8g5"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.282464 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv"]
Mar 17 11:44:17 crc kubenswrapper[4742]: E0317 11:44:17.282945 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71aa9411-3abc-46dd-9907-3f2847f83866" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.282965 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="71aa9411-3abc-46dd-9907-3f2847f83866" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 17 11:44:17 crc kubenswrapper[4742]: E0317 11:44:17.282981 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0236d66a-1c05-4000-928f-449316a872d2" containerName="oc"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.282990 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0236d66a-1c05-4000-928f-449316a872d2" containerName="oc"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.283234 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="0236d66a-1c05-4000-928f-449316a872d2" containerName="oc"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.283287 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="71aa9411-3abc-46dd-9907-3f2847f83866" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.284075 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.288351 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8b7p"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.288417 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.288478 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.290528 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.292392 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv"]
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.341127 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8h5b\" (UniqueName: \"kubernetes.io/projected/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-kube-api-access-k8h5b\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv\" (UID: \"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.341277 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv\" (UID: \"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.341300 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv\" (UID: \"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.442401 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv\" (UID: \"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.442447 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv\" (UID: \"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.442497 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8h5b\" (UniqueName: \"kubernetes.io/projected/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-kube-api-access-k8h5b\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv\" (UID: \"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.446440 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv\" (UID: \"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.446881 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv\" (UID: \"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.461443 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8h5b\" (UniqueName: \"kubernetes.io/projected/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-kube-api-access-k8h5b\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv\" (UID: \"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv"
Mar 17 11:44:17 crc kubenswrapper[4742]: I0317 11:44:17.607212 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv"
Mar 17 11:44:18 crc kubenswrapper[4742]: I0317 11:44:18.217575 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv"]
Mar 17 11:44:19 crc kubenswrapper[4742]: I0317 11:44:19.199708 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv" event={"ID":"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4","Type":"ContainerStarted","Data":"368df11184f250041df22e29b1c4ae6cb5218ef0d2ac9734cbae35a391a57315"}
Mar 17 11:44:19 crc kubenswrapper[4742]: I0317 11:44:19.200075 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv" event={"ID":"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4","Type":"ContainerStarted","Data":"97ef6d15a7c8ec7cf19420e1165f4f17d26eebde63207707c87078c8cf1707bd"}
Mar 17 11:44:19 crc kubenswrapper[4742]: I0317 11:44:19.215459 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv" podStartSLOduration=1.672973405 podStartE2EDuration="2.215435368s" podCreationTimestamp="2026-03-17 11:44:17 +0000 UTC" firstStartedPulling="2026-03-17 11:44:18.216604182 +0000 UTC m=+1961.342731940" lastFinishedPulling="2026-03-17 11:44:18.759066135 +0000 UTC m=+1961.885193903" observedRunningTime="2026-03-17 11:44:19.214790422 +0000 UTC m=+1962.340918190" watchObservedRunningTime="2026-03-17 11:44:19.215435368 +0000 UTC m=+1962.341563146"
Mar 17 11:44:45 crc kubenswrapper[4742]: I0317 11:44:45.049856 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mdzj6"]
Mar 17 11:44:45 crc kubenswrapper[4742]: I0317 11:44:45.058080 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mdzj6"]
Mar 17 11:44:46 crc kubenswrapper[4742]: I0317 11:44:46.034599 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2xsng"]
Mar 17 11:44:46 crc kubenswrapper[4742]: I0317 11:44:46.044072 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2xsng"]
Mar 17 11:44:46 crc kubenswrapper[4742]: I0317 11:44:46.688536 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3bf0544-e60c-4ca7-b535-7d43244a766b" path="/var/lib/kubelet/pods/a3bf0544-e60c-4ca7-b535-7d43244a766b/volumes"
Mar 17 11:44:46 crc kubenswrapper[4742]: I0317 11:44:46.689892 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9" path="/var/lib/kubelet/pods/ba4702f6-9538-41c9-b1fe-f31f4ef9f4c9/volumes"
Mar 17 11:44:55 crc kubenswrapper[4742]: I0317 11:44:55.002202 4742 scope.go:117] "RemoveContainer" containerID="d91483f1c643cbf53ffaab71a766e519070f4b7ff3910bee60a43a9b148efa77"
Mar 17 11:44:55 crc kubenswrapper[4742]: I0317 11:44:55.061744 4742 scope.go:117] "RemoveContainer" containerID="49c102ea2f25979bc529898327d689137acdb1c3e0ef759e8b4da71e4736f9aa"
Mar 17 11:44:55 crc kubenswrapper[4742]: I0317 11:44:55.121650 4742 scope.go:117] "RemoveContainer" containerID="2e36d6941e4aed6d67d8c74f4fa6ab620b2b6d0eaadcd77a0c001881f7b87bfd"
Mar 17 11:44:55 crc kubenswrapper[4742]: I0317 11:44:55.173245 4742 scope.go:117] "RemoveContainer" containerID="05ab3e8b616f46616a6aed4d636187b891aebc6db5276ac080318e5c8b5f1902"
Mar 17 11:45:00 crc kubenswrapper[4742]: I0317 11:45:00.159883 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz"]
Mar 17 11:45:00 crc kubenswrapper[4742]: I0317 11:45:00.162465 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz"
Mar 17 11:45:00 crc kubenswrapper[4742]: I0317 11:45:00.165949 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 17 11:45:00 crc kubenswrapper[4742]: I0317 11:45:00.166017 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 17 11:45:00 crc kubenswrapper[4742]: I0317 11:45:00.178150 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz"]
Mar 17 11:45:00 crc kubenswrapper[4742]: I0317 11:45:00.318522 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/265deefd-0bb9-41f3-b68d-5ffd620b2575-secret-volume\") pod \"collect-profiles-29562465-9nfvz\" (UID: \"265deefd-0bb9-41f3-b68d-5ffd620b2575\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz"
Mar 17 11:45:00 crc kubenswrapper[4742]: I0317 11:45:00.319035 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxpbr\" (UniqueName: \"kubernetes.io/projected/265deefd-0bb9-41f3-b68d-5ffd620b2575-kube-api-access-wxpbr\") pod \"collect-profiles-29562465-9nfvz\" (UID: \"265deefd-0bb9-41f3-b68d-5ffd620b2575\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz"
Mar 17 11:45:00 crc kubenswrapper[4742]: I0317 11:45:00.319173 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/265deefd-0bb9-41f3-b68d-5ffd620b2575-config-volume\") pod \"collect-profiles-29562465-9nfvz\" (UID: \"265deefd-0bb9-41f3-b68d-5ffd620b2575\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz"
Mar 17 11:45:00 crc kubenswrapper[4742]: I0317 11:45:00.420603 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/265deefd-0bb9-41f3-b68d-5ffd620b2575-config-volume\") pod \"collect-profiles-29562465-9nfvz\" (UID: \"265deefd-0bb9-41f3-b68d-5ffd620b2575\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz"
Mar 17 11:45:00 crc kubenswrapper[4742]: I0317 11:45:00.420735 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/265deefd-0bb9-41f3-b68d-5ffd620b2575-secret-volume\") pod \"collect-profiles-29562465-9nfvz\" (UID: \"265deefd-0bb9-41f3-b68d-5ffd620b2575\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz"
Mar 17 11:45:00 crc kubenswrapper[4742]: I0317 11:45:00.420828 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxpbr\" (UniqueName: \"kubernetes.io/projected/265deefd-0bb9-41f3-b68d-5ffd620b2575-kube-api-access-wxpbr\") pod \"collect-profiles-29562465-9nfvz\" (UID: \"265deefd-0bb9-41f3-b68d-5ffd620b2575\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz"
Mar 17 11:45:00 crc kubenswrapper[4742]: I0317 11:45:00.422317 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/265deefd-0bb9-41f3-b68d-5ffd620b2575-config-volume\") pod \"collect-profiles-29562465-9nfvz\" (UID: \"265deefd-0bb9-41f3-b68d-5ffd620b2575\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz"
Mar 17 11:45:00 crc kubenswrapper[4742]: I0317 11:45:00.434993 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/265deefd-0bb9-41f3-b68d-5ffd620b2575-secret-volume\") pod \"collect-profiles-29562465-9nfvz\" (UID: \"265deefd-0bb9-41f3-b68d-5ffd620b2575\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz"
Mar 17 11:45:00 crc kubenswrapper[4742]: I0317 11:45:00.455623 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxpbr\" (UniqueName: \"kubernetes.io/projected/265deefd-0bb9-41f3-b68d-5ffd620b2575-kube-api-access-wxpbr\") pod \"collect-profiles-29562465-9nfvz\" (UID: \"265deefd-0bb9-41f3-b68d-5ffd620b2575\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz"
Mar 17 11:45:00 crc kubenswrapper[4742]: I0317 11:45:00.500361 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz"
Mar 17 11:45:01 crc kubenswrapper[4742]: I0317 11:45:01.054398 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz"]
Mar 17 11:45:01 crc kubenswrapper[4742]: I0317 11:45:01.873631 4742 generic.go:334] "Generic (PLEG): container finished" podID="265deefd-0bb9-41f3-b68d-5ffd620b2575" containerID="e4d018aa9b739cc45f849887eecae91159c71a89ec3da6416581033df67289c2" exitCode=0
Mar 17 11:45:01 crc kubenswrapper[4742]: I0317 11:45:01.873719 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz" event={"ID":"265deefd-0bb9-41f3-b68d-5ffd620b2575","Type":"ContainerDied","Data":"e4d018aa9b739cc45f849887eecae91159c71a89ec3da6416581033df67289c2"}
Mar 17 11:45:01 crc kubenswrapper[4742]: I0317 11:45:01.873758 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz" event={"ID":"265deefd-0bb9-41f3-b68d-5ffd620b2575","Type":"ContainerStarted","Data":"100139d086630db0a189ec06ce8513fddcd3ca3e0c7ee628c87c5b92e934ad6d"}
Mar 17 11:45:03 crc kubenswrapper[4742]: I0317 11:45:03.303838 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz"
Mar 17 11:45:03 crc kubenswrapper[4742]: I0317 11:45:03.493688 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxpbr\" (UniqueName: \"kubernetes.io/projected/265deefd-0bb9-41f3-b68d-5ffd620b2575-kube-api-access-wxpbr\") pod \"265deefd-0bb9-41f3-b68d-5ffd620b2575\" (UID: \"265deefd-0bb9-41f3-b68d-5ffd620b2575\") "
Mar 17 11:45:03 crc kubenswrapper[4742]: I0317 11:45:03.493741 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/265deefd-0bb9-41f3-b68d-5ffd620b2575-config-volume\") pod \"265deefd-0bb9-41f3-b68d-5ffd620b2575\" (UID: \"265deefd-0bb9-41f3-b68d-5ffd620b2575\") "
Mar 17 11:45:03 crc kubenswrapper[4742]: I0317 11:45:03.493763 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/265deefd-0bb9-41f3-b68d-5ffd620b2575-secret-volume\") pod \"265deefd-0bb9-41f3-b68d-5ffd620b2575\" (UID: \"265deefd-0bb9-41f3-b68d-5ffd620b2575\") "
Mar 17 11:45:03 crc kubenswrapper[4742]: I0317 11:45:03.494653 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/265deefd-0bb9-41f3-b68d-5ffd620b2575-config-volume" (OuterVolumeSpecName: "config-volume") pod "265deefd-0bb9-41f3-b68d-5ffd620b2575" (UID: "265deefd-0bb9-41f3-b68d-5ffd620b2575"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 11:45:03 crc kubenswrapper[4742]: I0317 11:45:03.501077 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/265deefd-0bb9-41f3-b68d-5ffd620b2575-kube-api-access-wxpbr" (OuterVolumeSpecName: "kube-api-access-wxpbr") pod "265deefd-0bb9-41f3-b68d-5ffd620b2575" (UID: "265deefd-0bb9-41f3-b68d-5ffd620b2575"). InnerVolumeSpecName "kube-api-access-wxpbr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:45:03 crc kubenswrapper[4742]: I0317 11:45:03.509348 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265deefd-0bb9-41f3-b68d-5ffd620b2575-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "265deefd-0bb9-41f3-b68d-5ffd620b2575" (UID: "265deefd-0bb9-41f3-b68d-5ffd620b2575"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:45:03 crc kubenswrapper[4742]: I0317 11:45:03.595825 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxpbr\" (UniqueName: \"kubernetes.io/projected/265deefd-0bb9-41f3-b68d-5ffd620b2575-kube-api-access-wxpbr\") on node \"crc\" DevicePath \"\""
Mar 17 11:45:03 crc kubenswrapper[4742]: I0317 11:45:03.595868 4742 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/265deefd-0bb9-41f3-b68d-5ffd620b2575-config-volume\") on node \"crc\" DevicePath \"\""
Mar 17 11:45:03 crc kubenswrapper[4742]: I0317 11:45:03.595885 4742 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/265deefd-0bb9-41f3-b68d-5ffd620b2575-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 17 11:45:03 crc kubenswrapper[4742]: I0317 11:45:03.906903 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz" event={"ID":"265deefd-0bb9-41f3-b68d-5ffd620b2575","Type":"ContainerDied","Data":"100139d086630db0a189ec06ce8513fddcd3ca3e0c7ee628c87c5b92e934ad6d"}
Mar 17 11:45:03 crc kubenswrapper[4742]: I0317 11:45:03.907639 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="100139d086630db0a189ec06ce8513fddcd3ca3e0c7ee628c87c5b92e934ad6d"
Mar 17 11:45:03 crc kubenswrapper[4742]: I0317 11:45:03.907196 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562465-9nfvz"
Mar 17 11:45:09 crc kubenswrapper[4742]: I0317 11:45:09.964960 4742 generic.go:334] "Generic (PLEG): container finished" podID="bfb05f67-f7aa-480f-a4e9-3f24ee2102d4" containerID="368df11184f250041df22e29b1c4ae6cb5218ef0d2ac9734cbae35a391a57315" exitCode=0
Mar 17 11:45:09 crc kubenswrapper[4742]: I0317 11:45:09.965054 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv" event={"ID":"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4","Type":"ContainerDied","Data":"368df11184f250041df22e29b1c4ae6cb5218ef0d2ac9734cbae35a391a57315"}
Mar 17 11:45:11 crc kubenswrapper[4742]: I0317 11:45:11.418230 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv"
Mar 17 11:45:11 crc kubenswrapper[4742]: I0317 11:45:11.557591 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-ssh-key-openstack-edpm-ipam\") pod \"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4\" (UID: \"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4\") "
Mar 17 11:45:11 crc kubenswrapper[4742]: I0317 11:45:11.557835 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-inventory\") pod \"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4\" (UID: \"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4\") "
Mar 17 11:45:11 crc kubenswrapper[4742]: I0317 11:45:11.557922 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8h5b\" (UniqueName: \"kubernetes.io/projected/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-kube-api-access-k8h5b\") pod \"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4\" (UID: \"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4\") "
Mar 17 11:45:11 crc kubenswrapper[4742]: I0317 11:45:11.563269 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-kube-api-access-k8h5b" (OuterVolumeSpecName: "kube-api-access-k8h5b") pod "bfb05f67-f7aa-480f-a4e9-3f24ee2102d4" (UID: "bfb05f67-f7aa-480f-a4e9-3f24ee2102d4"). InnerVolumeSpecName "kube-api-access-k8h5b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:45:11 crc kubenswrapper[4742]: I0317 11:45:11.585655 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bfb05f67-f7aa-480f-a4e9-3f24ee2102d4" (UID: "bfb05f67-f7aa-480f-a4e9-3f24ee2102d4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:45:11 crc kubenswrapper[4742]: I0317 11:45:11.590558 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-inventory" (OuterVolumeSpecName: "inventory") pod "bfb05f67-f7aa-480f-a4e9-3f24ee2102d4" (UID: "bfb05f67-f7aa-480f-a4e9-3f24ee2102d4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:45:11 crc kubenswrapper[4742]: I0317 11:45:11.660540 4742 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-inventory\") on node \"crc\" DevicePath \"\""
Mar 17 11:45:11 crc kubenswrapper[4742]: I0317 11:45:11.660580 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8h5b\" (UniqueName: \"kubernetes.io/projected/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-kube-api-access-k8h5b\") on node \"crc\" DevicePath \"\""
Mar 17 11:45:11 crc kubenswrapper[4742]: I0317 11:45:11.660595 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfb05f67-f7aa-480f-a4e9-3f24ee2102d4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 17 11:45:11 crc kubenswrapper[4742]: I0317 11:45:11.982249 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv" event={"ID":"bfb05f67-f7aa-480f-a4e9-3f24ee2102d4","Type":"ContainerDied","Data":"97ef6d15a7c8ec7cf19420e1165f4f17d26eebde63207707c87078c8cf1707bd"}
Mar 17 11:45:11 crc kubenswrapper[4742]: I0317 11:45:11.982287 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97ef6d15a7c8ec7cf19420e1165f4f17d26eebde63207707c87078c8cf1707bd"
Mar 17 11:45:11 crc kubenswrapper[4742]: I0317 11:45:11.982309 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.064451 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p4fvs"]
Mar 17 11:45:12 crc kubenswrapper[4742]: E0317 11:45:12.064887 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265deefd-0bb9-41f3-b68d-5ffd620b2575" containerName="collect-profiles"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.064924 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="265deefd-0bb9-41f3-b68d-5ffd620b2575" containerName="collect-profiles"
Mar 17 11:45:12 crc kubenswrapper[4742]: E0317 11:45:12.064942 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb05f67-f7aa-480f-a4e9-3f24ee2102d4" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.064952 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb05f67-f7aa-480f-a4e9-3f24ee2102d4" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.065179 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="265deefd-0bb9-41f3-b68d-5ffd620b2575" containerName="collect-profiles"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.065207 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb05f67-f7aa-480f-a4e9-3f24ee2102d4" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.067097 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.070683 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8b7p"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.071160 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.071591 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.071597 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.076805 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p4fvs"]
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.168356 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ba90bc1a-0e57-455d-8594-4e11b1548097-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p4fvs\" (UID: \"ba90bc1a-0e57-455d-8594-4e11b1548097\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.168515 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tjpp\" (UniqueName: \"kubernetes.io/projected/ba90bc1a-0e57-455d-8594-4e11b1548097-kube-api-access-6tjpp\") pod \"ssh-known-hosts-edpm-deployment-p4fvs\" (UID: \"ba90bc1a-0e57-455d-8594-4e11b1548097\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.168697 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ba90bc1a-0e57-455d-8594-4e11b1548097-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p4fvs\" (UID: \"ba90bc1a-0e57-455d-8594-4e11b1548097\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.270428 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tjpp\" (UniqueName: \"kubernetes.io/projected/ba90bc1a-0e57-455d-8594-4e11b1548097-kube-api-access-6tjpp\") pod \"ssh-known-hosts-edpm-deployment-p4fvs\" (UID: \"ba90bc1a-0e57-455d-8594-4e11b1548097\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.270649 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ba90bc1a-0e57-455d-8594-4e11b1548097-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p4fvs\" (UID: \"ba90bc1a-0e57-455d-8594-4e11b1548097\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.270792 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ba90bc1a-0e57-455d-8594-4e11b1548097-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p4fvs\" (UID: \"ba90bc1a-0e57-455d-8594-4e11b1548097\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.274982 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ba90bc1a-0e57-455d-8594-4e11b1548097-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-p4fvs\" (UID: \"ba90bc1a-0e57-455d-8594-4e11b1548097\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.280590 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ba90bc1a-0e57-455d-8594-4e11b1548097-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-p4fvs\" (UID: \"ba90bc1a-0e57-455d-8594-4e11b1548097\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.303534 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tjpp\" (UniqueName: \"kubernetes.io/projected/ba90bc1a-0e57-455d-8594-4e11b1548097-kube-api-access-6tjpp\") pod \"ssh-known-hosts-edpm-deployment-p4fvs\" (UID: \"ba90bc1a-0e57-455d-8594-4e11b1548097\") " pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs"
Mar 17 11:45:12 crc kubenswrapper[4742]: I0317 11:45:12.417612 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs"
Mar 17 11:45:13 crc kubenswrapper[4742]: I0317 11:45:13.011280 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-p4fvs"]
Mar 17 11:45:14 crc kubenswrapper[4742]: I0317 11:45:14.006826 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs" event={"ID":"ba90bc1a-0e57-455d-8594-4e11b1548097","Type":"ContainerStarted","Data":"007c5c11237591b5bd5acf10ea5761dafab8e6a08300b41aa151ecb669ae001a"}
Mar 17 11:45:14 crc kubenswrapper[4742]: I0317 11:45:14.007204 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs" event={"ID":"ba90bc1a-0e57-455d-8594-4e11b1548097","Type":"ContainerStarted","Data":"3461489ce3a193a223b61046c059423563f556e338ed89bfac28f332803fefc2"}
Mar 17 11:45:14 crc kubenswrapper[4742]: I0317 11:45:14.033437 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs" podStartSLOduration=1.377240651 podStartE2EDuration="2.033417137s" podCreationTimestamp="2026-03-17 11:45:12 +0000 UTC" firstStartedPulling="2026-03-17 11:45:13.02243788 +0000 UTC m=+2016.148565638" lastFinishedPulling="2026-03-17 11:45:13.678614336 +0000 UTC m=+2016.804742124" observedRunningTime="2026-03-17 11:45:14.024315274 +0000 UTC m=+2017.150443032" watchObservedRunningTime="2026-03-17 11:45:14.033417137 +0000 UTC m=+2017.159544905"
Mar 17 11:45:21 crc kubenswrapper[4742]: I0317 11:45:21.080160 4742 generic.go:334] "Generic (PLEG): container finished" podID="ba90bc1a-0e57-455d-8594-4e11b1548097" containerID="007c5c11237591b5bd5acf10ea5761dafab8e6a08300b41aa151ecb669ae001a" exitCode=0
Mar 17 11:45:21 crc kubenswrapper[4742]: I0317 11:45:21.080236 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs" event={"ID":"ba90bc1a-0e57-455d-8594-4e11b1548097","Type":"ContainerDied","Data":"007c5c11237591b5bd5acf10ea5761dafab8e6a08300b41aa151ecb669ae001a"}
Mar 17 11:45:22 crc kubenswrapper[4742]: I0317 11:45:22.560699 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs"
Mar 17 11:45:22 crc kubenswrapper[4742]: I0317 11:45:22.700174 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ba90bc1a-0e57-455d-8594-4e11b1548097-ssh-key-openstack-edpm-ipam\") pod \"ba90bc1a-0e57-455d-8594-4e11b1548097\" (UID: \"ba90bc1a-0e57-455d-8594-4e11b1548097\") "
Mar 17 11:45:22 crc kubenswrapper[4742]: I0317 11:45:22.700227 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ba90bc1a-0e57-455d-8594-4e11b1548097-inventory-0\") pod \"ba90bc1a-0e57-455d-8594-4e11b1548097\" (UID: \"ba90bc1a-0e57-455d-8594-4e11b1548097\") "
Mar 17 11:45:22 crc kubenswrapper[4742]: I0317 11:45:22.700373 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tjpp\" (UniqueName: \"kubernetes.io/projected/ba90bc1a-0e57-455d-8594-4e11b1548097-kube-api-access-6tjpp\") pod \"ba90bc1a-0e57-455d-8594-4e11b1548097\" (UID: \"ba90bc1a-0e57-455d-8594-4e11b1548097\") "
Mar 17 11:45:22 crc kubenswrapper[4742]: I0317 11:45:22.705509 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba90bc1a-0e57-455d-8594-4e11b1548097-kube-api-access-6tjpp" (OuterVolumeSpecName: "kube-api-access-6tjpp") pod "ba90bc1a-0e57-455d-8594-4e11b1548097" (UID: "ba90bc1a-0e57-455d-8594-4e11b1548097"). InnerVolumeSpecName "kube-api-access-6tjpp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:45:22 crc kubenswrapper[4742]: I0317 11:45:22.725209 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba90bc1a-0e57-455d-8594-4e11b1548097-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "ba90bc1a-0e57-455d-8594-4e11b1548097" (UID: "ba90bc1a-0e57-455d-8594-4e11b1548097"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:45:22 crc kubenswrapper[4742]: I0317 11:45:22.726610 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba90bc1a-0e57-455d-8594-4e11b1548097-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ba90bc1a-0e57-455d-8594-4e11b1548097" (UID: "ba90bc1a-0e57-455d-8594-4e11b1548097"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:45:22 crc kubenswrapper[4742]: I0317 11:45:22.802110 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ba90bc1a-0e57-455d-8594-4e11b1548097-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 17 11:45:22 crc kubenswrapper[4742]: I0317 11:45:22.802136 4742 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ba90bc1a-0e57-455d-8594-4e11b1548097-inventory-0\") on node \"crc\" DevicePath \"\""
Mar 17 11:45:22 crc kubenswrapper[4742]: I0317 11:45:22.802146 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tjpp\" (UniqueName: \"kubernetes.io/projected/ba90bc1a-0e57-455d-8594-4e11b1548097-kube-api-access-6tjpp\") on node \"crc\" DevicePath \"\""
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.101640 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs" event={"ID":"ba90bc1a-0e57-455d-8594-4e11b1548097","Type":"ContainerDied","Data":"3461489ce3a193a223b61046c059423563f556e338ed89bfac28f332803fefc2"}
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.101686 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-p4fvs"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.101701 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3461489ce3a193a223b61046c059423563f556e338ed89bfac28f332803fefc2"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.187966 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n"]
Mar 17 11:45:23 crc kubenswrapper[4742]: E0317 11:45:23.188380 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba90bc1a-0e57-455d-8594-4e11b1548097" containerName="ssh-known-hosts-edpm-deployment"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.188400 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba90bc1a-0e57-455d-8594-4e11b1548097" containerName="ssh-known-hosts-edpm-deployment"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.188648 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba90bc1a-0e57-455d-8594-4e11b1548097" containerName="ssh-known-hosts-edpm-deployment"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.189363 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.193223 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.193655 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.193987 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.195563 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8b7p"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.200988 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n"]
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.313718 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j75p8\" (UniqueName: \"kubernetes.io/projected/2c1f61c9-540b-4044-ba34-2bb110401fa0-kube-api-access-j75p8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7mg2n\" (UID: \"2c1f61c9-540b-4044-ba34-2bb110401fa0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.313892 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c1f61c9-540b-4044-ba34-2bb110401fa0-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7mg2n\" (UID: \"2c1f61c9-540b-4044-ba34-2bb110401fa0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.314047 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c1f61c9-540b-4044-ba34-2bb110401fa0-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7mg2n\" (UID: \"2c1f61c9-540b-4044-ba34-2bb110401fa0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.415814 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j75p8\" (UniqueName: \"kubernetes.io/projected/2c1f61c9-540b-4044-ba34-2bb110401fa0-kube-api-access-j75p8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7mg2n\" (UID: \"2c1f61c9-540b-4044-ba34-2bb110401fa0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.416460 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c1f61c9-540b-4044-ba34-2bb110401fa0-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7mg2n\" (UID: \"2c1f61c9-540b-4044-ba34-2bb110401fa0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.416706 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c1f61c9-540b-4044-ba34-2bb110401fa0-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7mg2n\" (UID: \"2c1f61c9-540b-4044-ba34-2bb110401fa0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.422517 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c1f61c9-540b-4044-ba34-2bb110401fa0-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7mg2n\" (UID: \"2c1f61c9-540b-4044-ba34-2bb110401fa0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.424595 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c1f61c9-540b-4044-ba34-2bb110401fa0-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7mg2n\" (UID: \"2c1f61c9-540b-4044-ba34-2bb110401fa0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.441045 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j75p8\" (UniqueName: \"kubernetes.io/projected/2c1f61c9-540b-4044-ba34-2bb110401fa0-kube-api-access-j75p8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7mg2n\" (UID: \"2c1f61c9-540b-4044-ba34-2bb110401fa0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n"
Mar 17 11:45:23 crc kubenswrapper[4742]: I0317 11:45:23.529050 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n"
Mar 17 11:45:24 crc kubenswrapper[4742]: I0317 11:45:24.032106 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n"]
Mar 17 11:45:24 crc kubenswrapper[4742]: I0317 11:45:24.112321 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n" event={"ID":"2c1f61c9-540b-4044-ba34-2bb110401fa0","Type":"ContainerStarted","Data":"0af40a934beb391b07c53554e059fe80f26f033cc4c0a4f5cf7f4e01ed9f239c"}
Mar 17 11:45:25 crc kubenswrapper[4742]: I0317 11:45:25.125866 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n" event={"ID":"2c1f61c9-540b-4044-ba34-2bb110401fa0","Type":"ContainerStarted","Data":"5d91edee80771c380089fb879cd4f78ef8796dd2da8bccfcd8318d3b63c5d363"}
Mar 17 11:45:25 crc kubenswrapper[4742]: I0317 11:45:25.151683 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n" podStartSLOduration=1.648560509 podStartE2EDuration="2.151661586s" podCreationTimestamp="2026-03-17 11:45:23 +0000 UTC" firstStartedPulling="2026-03-17 11:45:24.044285877 +0000 UTC m=+2027.170413675" lastFinishedPulling="2026-03-17 11:45:24.547386964 +0000 UTC m=+2027.673514752" observedRunningTime="2026-03-17 11:45:25.14754726 +0000 UTC m=+2028.273675028" watchObservedRunningTime="2026-03-17 11:45:25.151661586 +0000 UTC m=+2028.277789344"
Mar 17 11:45:30 crc kubenswrapper[4742]: I0317 11:45:30.058214 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-zt8wj"]
Mar 17 11:45:30 crc kubenswrapper[4742]: I0317 11:45:30.071188 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-zt8wj"]
Mar 17 11:45:30 crc kubenswrapper[4742]: I0317 11:45:30.672212 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb" path="/var/lib/kubelet/pods/17a1029b-b8c6-4f7e-95dc-ff9b6d5f32eb/volumes"
Mar 17 11:45:33 crc kubenswrapper[4742]: I0317 11:45:33.214257 4742 generic.go:334] "Generic (PLEG): container finished" podID="2c1f61c9-540b-4044-ba34-2bb110401fa0" containerID="5d91edee80771c380089fb879cd4f78ef8796dd2da8bccfcd8318d3b63c5d363" exitCode=0
Mar 17 11:45:33 crc kubenswrapper[4742]: I0317 11:45:33.214395 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n" event={"ID":"2c1f61c9-540b-4044-ba34-2bb110401fa0","Type":"ContainerDied","Data":"5d91edee80771c380089fb879cd4f78ef8796dd2da8bccfcd8318d3b63c5d363"}
Mar 17 11:45:34 crc kubenswrapper[4742]: I0317 11:45:34.736340 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n"
Mar 17 11:45:34 crc kubenswrapper[4742]: I0317 11:45:34.783324 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c1f61c9-540b-4044-ba34-2bb110401fa0-inventory\") pod \"2c1f61c9-540b-4044-ba34-2bb110401fa0\" (UID: \"2c1f61c9-540b-4044-ba34-2bb110401fa0\") "
Mar 17 11:45:34 crc kubenswrapper[4742]: I0317 11:45:34.783678 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c1f61c9-540b-4044-ba34-2bb110401fa0-ssh-key-openstack-edpm-ipam\") pod \"2c1f61c9-540b-4044-ba34-2bb110401fa0\" (UID: \"2c1f61c9-540b-4044-ba34-2bb110401fa0\") "
Mar 17 11:45:34 crc kubenswrapper[4742]: I0317 11:45:34.783984 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j75p8\" (UniqueName: \"kubernetes.io/projected/2c1f61c9-540b-4044-ba34-2bb110401fa0-kube-api-access-j75p8\") pod \"2c1f61c9-540b-4044-ba34-2bb110401fa0\" (UID: \"2c1f61c9-540b-4044-ba34-2bb110401fa0\") "
Mar 17 11:45:34 crc kubenswrapper[4742]: I0317 11:45:34.794006 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c1f61c9-540b-4044-ba34-2bb110401fa0-kube-api-access-j75p8" (OuterVolumeSpecName: "kube-api-access-j75p8") pod "2c1f61c9-540b-4044-ba34-2bb110401fa0" (UID: "2c1f61c9-540b-4044-ba34-2bb110401fa0"). InnerVolumeSpecName "kube-api-access-j75p8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:45:34 crc kubenswrapper[4742]: I0317 11:45:34.810671 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1f61c9-540b-4044-ba34-2bb110401fa0-inventory" (OuterVolumeSpecName: "inventory") pod "2c1f61c9-540b-4044-ba34-2bb110401fa0" (UID: "2c1f61c9-540b-4044-ba34-2bb110401fa0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:45:34 crc kubenswrapper[4742]: I0317 11:45:34.836483 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1f61c9-540b-4044-ba34-2bb110401fa0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2c1f61c9-540b-4044-ba34-2bb110401fa0" (UID: "2c1f61c9-540b-4044-ba34-2bb110401fa0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:45:34 crc kubenswrapper[4742]: I0317 11:45:34.886791 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j75p8\" (UniqueName: \"kubernetes.io/projected/2c1f61c9-540b-4044-ba34-2bb110401fa0-kube-api-access-j75p8\") on node \"crc\" DevicePath \"\""
Mar 17 11:45:34 crc kubenswrapper[4742]: I0317 11:45:34.887178 4742 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c1f61c9-540b-4044-ba34-2bb110401fa0-inventory\") on node \"crc\" DevicePath \"\""
Mar 17 11:45:34 crc kubenswrapper[4742]: I0317 11:45:34.887195 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2c1f61c9-540b-4044-ba34-2bb110401fa0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.239482 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n" event={"ID":"2c1f61c9-540b-4044-ba34-2bb110401fa0","Type":"ContainerDied","Data":"0af40a934beb391b07c53554e059fe80f26f033cc4c0a4f5cf7f4e01ed9f239c"}
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.239561 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0af40a934beb391b07c53554e059fe80f26f033cc4c0a4f5cf7f4e01ed9f239c"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.239589 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7mg2n"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.343557 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p"]
Mar 17 11:45:35 crc kubenswrapper[4742]: E0317 11:45:35.343945 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c1f61c9-540b-4044-ba34-2bb110401fa0" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.343962 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c1f61c9-540b-4044-ba34-2bb110401fa0" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.344136 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c1f61c9-540b-4044-ba34-2bb110401fa0" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.344695 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.358974 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p"]
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.360191 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.360399 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.360424 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8b7p"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.360477 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.398607 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa52e3ae-e09a-4561-990a-59358b9b17b6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p\" (UID: \"aa52e3ae-e09a-4561-990a-59358b9b17b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.398690 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dczwm\" (UniqueName: \"kubernetes.io/projected/aa52e3ae-e09a-4561-990a-59358b9b17b6-kube-api-access-dczwm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p\" (UID: \"aa52e3ae-e09a-4561-990a-59358b9b17b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.398848 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aa52e3ae-e09a-4561-990a-59358b9b17b6-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p\" (UID: \"aa52e3ae-e09a-4561-990a-59358b9b17b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.501226 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dczwm\" (UniqueName: \"kubernetes.io/projected/aa52e3ae-e09a-4561-990a-59358b9b17b6-kube-api-access-dczwm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p\" (UID: \"aa52e3ae-e09a-4561-990a-59358b9b17b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.501311 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aa52e3ae-e09a-4561-990a-59358b9b17b6-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p\" (UID: \"aa52e3ae-e09a-4561-990a-59358b9b17b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.501475 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa52e3ae-e09a-4561-990a-59358b9b17b6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p\" (UID: \"aa52e3ae-e09a-4561-990a-59358b9b17b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.506122 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa52e3ae-e09a-4561-990a-59358b9b17b6-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p\" (UID: \"aa52e3ae-e09a-4561-990a-59358b9b17b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.508597 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aa52e3ae-e09a-4561-990a-59358b9b17b6-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p\" (UID: \"aa52e3ae-e09a-4561-990a-59358b9b17b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.531109 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dczwm\" (UniqueName: \"kubernetes.io/projected/aa52e3ae-e09a-4561-990a-59358b9b17b6-kube-api-access-dczwm\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p\" (UID: \"aa52e3ae-e09a-4561-990a-59358b9b17b6\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p"
Mar 17 11:45:35 crc kubenswrapper[4742]: I0317 11:45:35.672141 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p"
Mar 17 11:45:36 crc kubenswrapper[4742]: I0317 11:45:36.239170 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p"]
Mar 17 11:45:36 crc kubenswrapper[4742]: I0317 11:45:36.252128 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p" event={"ID":"aa52e3ae-e09a-4561-990a-59358b9b17b6","Type":"ContainerStarted","Data":"8f36daf696b55c39c66740c1da4d9d6efa9ba76c90034b6378286b7ac2295909"}
Mar 17 11:45:37 crc kubenswrapper[4742]: I0317 11:45:37.264740 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p" event={"ID":"aa52e3ae-e09a-4561-990a-59358b9b17b6","Type":"ContainerStarted","Data":"dd0491a1a89c3df45ca275c7cf963e83b2fcecfa5cd4687c79e58837934af667"}
Mar 17 11:45:37 crc kubenswrapper[4742]: I0317 11:45:37.285133 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p" podStartSLOduration=1.804527047 podStartE2EDuration="2.285113629s" podCreationTimestamp="2026-03-17 11:45:35 +0000 UTC" firstStartedPulling="2026-03-17 11:45:36.240796402 +0000 UTC m=+2039.366924170" lastFinishedPulling="2026-03-17 11:45:36.721382994 +0000 UTC m=+2039.847510752" observedRunningTime="2026-03-17 11:45:37.282428091 +0000 UTC m=+2040.408555849" watchObservedRunningTime="2026-03-17 11:45:37.285113629 +0000 UTC m=+2040.411241397"
Mar 17 11:45:47 crc kubenswrapper[4742]: I0317 11:45:47.376981 4742 generic.go:334] "Generic (PLEG): container finished" podID="aa52e3ae-e09a-4561-990a-59358b9b17b6" containerID="dd0491a1a89c3df45ca275c7cf963e83b2fcecfa5cd4687c79e58837934af667" exitCode=0
Mar 17 11:45:47 crc kubenswrapper[4742]: I0317 11:45:47.377062 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p" event={"ID":"aa52e3ae-e09a-4561-990a-59358b9b17b6","Type":"ContainerDied","Data":"dd0491a1a89c3df45ca275c7cf963e83b2fcecfa5cd4687c79e58837934af667"}
Mar 17 11:45:48 crc kubenswrapper[4742]: I0317 11:45:48.044967 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 11:45:48 crc kubenswrapper[4742]: I0317 11:45:48.045418 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 11:45:48 crc kubenswrapper[4742]: I0317 11:45:48.898457 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p"
Mar 17 11:45:48 crc kubenswrapper[4742]: I0317 11:45:48.930745 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aa52e3ae-e09a-4561-990a-59358b9b17b6-ssh-key-openstack-edpm-ipam\") pod \"aa52e3ae-e09a-4561-990a-59358b9b17b6\" (UID: \"aa52e3ae-e09a-4561-990a-59358b9b17b6\") "
Mar 17 11:45:48 crc kubenswrapper[4742]: I0317 11:45:48.931157 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dczwm\" (UniqueName: \"kubernetes.io/projected/aa52e3ae-e09a-4561-990a-59358b9b17b6-kube-api-access-dczwm\") pod \"aa52e3ae-e09a-4561-990a-59358b9b17b6\" (UID: \"aa52e3ae-e09a-4561-990a-59358b9b17b6\") "
Mar 17 11:45:48 crc kubenswrapper[4742]: I0317 11:45:48.931310 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa52e3ae-e09a-4561-990a-59358b9b17b6-inventory\") pod \"aa52e3ae-e09a-4561-990a-59358b9b17b6\" (UID: \"aa52e3ae-e09a-4561-990a-59358b9b17b6\") "
Mar 17 11:45:48 crc kubenswrapper[4742]: I0317 11:45:48.939234 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa52e3ae-e09a-4561-990a-59358b9b17b6-kube-api-access-dczwm" (OuterVolumeSpecName: "kube-api-access-dczwm") pod "aa52e3ae-e09a-4561-990a-59358b9b17b6" (UID: "aa52e3ae-e09a-4561-990a-59358b9b17b6"). InnerVolumeSpecName "kube-api-access-dczwm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:45:48 crc kubenswrapper[4742]: I0317 11:45:48.967964 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa52e3ae-e09a-4561-990a-59358b9b17b6-inventory" (OuterVolumeSpecName: "inventory") pod "aa52e3ae-e09a-4561-990a-59358b9b17b6" (UID: "aa52e3ae-e09a-4561-990a-59358b9b17b6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 11:45:48 crc kubenswrapper[4742]: I0317 11:45:48.982247 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa52e3ae-e09a-4561-990a-59358b9b17b6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "aa52e3ae-e09a-4561-990a-59358b9b17b6" (UID: "aa52e3ae-e09a-4561-990a-59358b9b17b6").
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.034393 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dczwm\" (UniqueName: \"kubernetes.io/projected/aa52e3ae-e09a-4561-990a-59358b9b17b6-kube-api-access-dczwm\") on node \"crc\" DevicePath \"\"" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.034583 4742 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa52e3ae-e09a-4561-990a-59358b9b17b6-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.034643 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aa52e3ae-e09a-4561-990a-59358b9b17b6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.404614 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p" event={"ID":"aa52e3ae-e09a-4561-990a-59358b9b17b6","Type":"ContainerDied","Data":"8f36daf696b55c39c66740c1da4d9d6efa9ba76c90034b6378286b7ac2295909"} Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.404654 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f36daf696b55c39c66740c1da4d9d6efa9ba76c90034b6378286b7ac2295909" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.404698 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.497225 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k"] Mar 17 11:45:49 crc kubenswrapper[4742]: E0317 11:45:49.497748 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa52e3ae-e09a-4561-990a-59358b9b17b6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.497767 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa52e3ae-e09a-4561-990a-59358b9b17b6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.498007 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa52e3ae-e09a-4561-990a-59358b9b17b6" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.498802 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.504367 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.504540 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.504735 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.504807 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8b7p" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.504970 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.505012 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.506971 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.507424 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k"] Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.510020 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.545416 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.545473 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.545499 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.545526 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.545796 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.545924 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.545953 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsrc2\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-kube-api-access-xsrc2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.545997 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.546028 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.546062 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.546092 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: 
\"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.546181 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.546209 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.546237 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.647959 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.648012 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsrc2\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-kube-api-access-xsrc2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.648062 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.648087 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.648121 4742 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.648159 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.648225 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.648382 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.648420 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.648456 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.648506 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.648532 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.648555 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.648693 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.653934 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.653945 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.654135 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.654539 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.655803 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.655969 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.655972 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.656079 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.657424 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.657511 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.658814 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.659670 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.660410 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.670191 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsrc2\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-kube-api-access-xsrc2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-6g95k\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:49 crc kubenswrapper[4742]: I0317 11:45:49.817161 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:45:50 crc kubenswrapper[4742]: I0317 11:45:50.357671 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k"] Mar 17 11:45:50 crc kubenswrapper[4742]: I0317 11:45:50.434795 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" event={"ID":"62491de6-4c04-49d7-82f2-124f6cceff11","Type":"ContainerStarted","Data":"058683ca8d2a732bfc93c3426ba434aa25440863f7b844968ad6abf4caafbc90"} Mar 17 11:45:51 crc kubenswrapper[4742]: I0317 11:45:51.446517 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" event={"ID":"62491de6-4c04-49d7-82f2-124f6cceff11","Type":"ContainerStarted","Data":"a38ae27b45d9a60407680bec27cb1a7e7b74da9377ef4e0a88bde06f1f37fb3c"} Mar 17 11:45:51 crc kubenswrapper[4742]: I0317 11:45:51.480590 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" podStartSLOduration=2.037953407 podStartE2EDuration="2.480563309s" podCreationTimestamp="2026-03-17 11:45:49 +0000 UTC" firstStartedPulling="2026-03-17 11:45:50.361639196 +0000 UTC m=+2053.487766954" lastFinishedPulling="2026-03-17 11:45:50.804249078 +0000 UTC m=+2053.930376856" observedRunningTime="2026-03-17 11:45:51.469357953 +0000 UTC m=+2054.595485741" watchObservedRunningTime="2026-03-17 11:45:51.480563309 +0000 UTC m=+2054.606691107" Mar 17 11:45:55 crc kubenswrapper[4742]: I0317 11:45:55.296654 4742 scope.go:117] "RemoveContainer" containerID="1bd6c3af4257784dba8c27c428d01b42ed4a63b5b3177825dd94022475d606c1" Mar 17 11:46:00 crc kubenswrapper[4742]: I0317 11:46:00.158131 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562466-z9vmr"] Mar 17 11:46:00 crc kubenswrapper[4742]: I0317 11:46:00.159947 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562466-z9vmr" Mar 17 11:46:00 crc kubenswrapper[4742]: I0317 11:46:00.162957 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:46:00 crc kubenswrapper[4742]: I0317 11:46:00.164112 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:46:00 crc kubenswrapper[4742]: I0317 11:46:00.164378 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:46:00 crc kubenswrapper[4742]: I0317 11:46:00.178311 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562466-z9vmr"] Mar 17 11:46:00 crc kubenswrapper[4742]: I0317 11:46:00.276365 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfktc\" (UniqueName: \"kubernetes.io/projected/4d1620f5-1ec3-4841-85d3-c162d7e62454-kube-api-access-sfktc\") pod \"auto-csr-approver-29562466-z9vmr\" (UID: \"4d1620f5-1ec3-4841-85d3-c162d7e62454\") " pod="openshift-infra/auto-csr-approver-29562466-z9vmr" Mar 17 11:46:00 crc kubenswrapper[4742]: I0317 11:46:00.378934 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfktc\" (UniqueName: \"kubernetes.io/projected/4d1620f5-1ec3-4841-85d3-c162d7e62454-kube-api-access-sfktc\") pod \"auto-csr-approver-29562466-z9vmr\" (UID: \"4d1620f5-1ec3-4841-85d3-c162d7e62454\") " pod="openshift-infra/auto-csr-approver-29562466-z9vmr" Mar 17 11:46:00 crc kubenswrapper[4742]: I0317 11:46:00.413734 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfktc\" (UniqueName: \"kubernetes.io/projected/4d1620f5-1ec3-4841-85d3-c162d7e62454-kube-api-access-sfktc\") pod \"auto-csr-approver-29562466-z9vmr\" (UID: \"4d1620f5-1ec3-4841-85d3-c162d7e62454\") " pod="openshift-infra/auto-csr-approver-29562466-z9vmr" Mar 17 11:46:00 crc kubenswrapper[4742]: I0317 11:46:00.487748 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562466-z9vmr" Mar 17 11:46:00 crc kubenswrapper[4742]: I0317 11:46:00.982037 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562466-z9vmr"] Mar 17 11:46:01 crc kubenswrapper[4742]: I0317 11:46:01.547211 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562466-z9vmr" event={"ID":"4d1620f5-1ec3-4841-85d3-c162d7e62454","Type":"ContainerStarted","Data":"da9cd8d85cd5d32b98f783ca98f1a1ebad5a07d5d3ea7b844f43ec8b2bcef9a4"} Mar 17 11:46:02 crc kubenswrapper[4742]: I0317 11:46:02.560023 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562466-z9vmr" event={"ID":"4d1620f5-1ec3-4841-85d3-c162d7e62454","Type":"ContainerStarted","Data":"653150bc1e23148b7ab0b6c2417a4318fd0b5b4a4929d6f22f6786b0aeb66151"} Mar 17 11:46:02 crc kubenswrapper[4742]: I0317 11:46:02.592692 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562466-z9vmr" podStartSLOduration=1.612582292 podStartE2EDuration="2.59266791s" podCreationTimestamp="2026-03-17 11:46:00 +0000 UTC" firstStartedPulling="2026-03-17 11:46:00.987509451 +0000 UTC m=+2064.113637229" lastFinishedPulling="2026-03-17 11:46:01.967595049 +0000 UTC m=+2065.093722847" observedRunningTime="2026-03-17 11:46:02.579523825 +0000 UTC m=+2065.705651613" watchObservedRunningTime="2026-03-17 11:46:02.59266791 +0000 UTC m=+2065.718795688" Mar 17 11:46:03 crc kubenswrapper[4742]: I0317 11:46:03.593288 4742 generic.go:334] "Generic (PLEG): container finished" podID="4d1620f5-1ec3-4841-85d3-c162d7e62454" containerID="653150bc1e23148b7ab0b6c2417a4318fd0b5b4a4929d6f22f6786b0aeb66151" exitCode=0 Mar 17 11:46:03 crc kubenswrapper[4742]: I0317 11:46:03.593366 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562466-z9vmr" event={"ID":"4d1620f5-1ec3-4841-85d3-c162d7e62454","Type":"ContainerDied","Data":"653150bc1e23148b7ab0b6c2417a4318fd0b5b4a4929d6f22f6786b0aeb66151"} Mar 17 11:46:05 crc kubenswrapper[4742]: I0317 11:46:05.041932 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562466-z9vmr" Mar 17 11:46:05 crc kubenswrapper[4742]: I0317 11:46:05.072094 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfktc\" (UniqueName: \"kubernetes.io/projected/4d1620f5-1ec3-4841-85d3-c162d7e62454-kube-api-access-sfktc\") pod \"4d1620f5-1ec3-4841-85d3-c162d7e62454\" (UID: \"4d1620f5-1ec3-4841-85d3-c162d7e62454\") " Mar 17 11:46:05 crc kubenswrapper[4742]: I0317 11:46:05.086206 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d1620f5-1ec3-4841-85d3-c162d7e62454-kube-api-access-sfktc" (OuterVolumeSpecName: "kube-api-access-sfktc") pod "4d1620f5-1ec3-4841-85d3-c162d7e62454" (UID: "4d1620f5-1ec3-4841-85d3-c162d7e62454"). InnerVolumeSpecName "kube-api-access-sfktc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:46:05 crc kubenswrapper[4742]: I0317 11:46:05.174440 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfktc\" (UniqueName: \"kubernetes.io/projected/4d1620f5-1ec3-4841-85d3-c162d7e62454-kube-api-access-sfktc\") on node \"crc\" DevicePath \"\"" Mar 17 11:46:05 crc kubenswrapper[4742]: I0317 11:46:05.620480 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562466-z9vmr" event={"ID":"4d1620f5-1ec3-4841-85d3-c162d7e62454","Type":"ContainerDied","Data":"da9cd8d85cd5d32b98f783ca98f1a1ebad5a07d5d3ea7b844f43ec8b2bcef9a4"} Mar 17 11:46:05 crc kubenswrapper[4742]: I0317 11:46:05.620540 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da9cd8d85cd5d32b98f783ca98f1a1ebad5a07d5d3ea7b844f43ec8b2bcef9a4" Mar 17 11:46:05 crc kubenswrapper[4742]: I0317 11:46:05.620551 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562466-z9vmr" Mar 17 11:46:05 crc kubenswrapper[4742]: I0317 11:46:05.691641 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562460-6xd9m"] Mar 17 11:46:05 crc kubenswrapper[4742]: I0317 11:46:05.703414 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562460-6xd9m"] Mar 17 11:46:06 crc kubenswrapper[4742]: I0317 11:46:06.682960 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5d96418-f216-474c-a8e5-d73833a30fd8" path="/var/lib/kubelet/pods/f5d96418-f216-474c-a8e5-d73833a30fd8/volumes" Mar 17 11:46:18 crc kubenswrapper[4742]: I0317 11:46:18.044624 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:46:18 crc kubenswrapper[4742]: I0317 11:46:18.045378 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:46:29 crc kubenswrapper[4742]: I0317 11:46:29.895640 4742 generic.go:334] "Generic (PLEG): container finished" podID="62491de6-4c04-49d7-82f2-124f6cceff11" containerID="a38ae27b45d9a60407680bec27cb1a7e7b74da9377ef4e0a88bde06f1f37fb3c" exitCode=0 Mar 17 11:46:29 crc kubenswrapper[4742]: I0317 11:46:29.895764 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" event={"ID":"62491de6-4c04-49d7-82f2-124f6cceff11","Type":"ContainerDied","Data":"a38ae27b45d9a60407680bec27cb1a7e7b74da9377ef4e0a88bde06f1f37fb3c"} Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.396594 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.553132 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-ovn-combined-ca-bundle\") pod \"62491de6-4c04-49d7-82f2-124f6cceff11\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.553461 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"62491de6-4c04-49d7-82f2-124f6cceff11\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.554175 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-inventory\") pod \"62491de6-4c04-49d7-82f2-124f6cceff11\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.554253 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-neutron-metadata-combined-ca-bundle\") pod \"62491de6-4c04-49d7-82f2-124f6cceff11\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.554331 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-ssh-key-openstack-edpm-ipam\") pod \"62491de6-4c04-49d7-82f2-124f6cceff11\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.554366 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"62491de6-4c04-49d7-82f2-124f6cceff11\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.554422 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"62491de6-4c04-49d7-82f2-124f6cceff11\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.554466 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsrc2\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-kube-api-access-xsrc2\") pod \"62491de6-4c04-49d7-82f2-124f6cceff11\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.554560 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-bootstrap-combined-ca-bundle\") pod \"62491de6-4c04-49d7-82f2-124f6cceff11\" (UID: 
\"62491de6-4c04-49d7-82f2-124f6cceff11\") " Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.554622 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-telemetry-combined-ca-bundle\") pod \"62491de6-4c04-49d7-82f2-124f6cceff11\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.554659 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-libvirt-combined-ca-bundle\") pod \"62491de6-4c04-49d7-82f2-124f6cceff11\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.554711 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-nova-combined-ca-bundle\") pod \"62491de6-4c04-49d7-82f2-124f6cceff11\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.554753 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-ovn-default-certs-0\") pod \"62491de6-4c04-49d7-82f2-124f6cceff11\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.554775 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-repo-setup-combined-ca-bundle\") pod \"62491de6-4c04-49d7-82f2-124f6cceff11\" (UID: \"62491de6-4c04-49d7-82f2-124f6cceff11\") " Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.562523 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "62491de6-4c04-49d7-82f2-124f6cceff11" (UID: "62491de6-4c04-49d7-82f2-124f6cceff11"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.562655 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "62491de6-4c04-49d7-82f2-124f6cceff11" (UID: "62491de6-4c04-49d7-82f2-124f6cceff11"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.562827 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-kube-api-access-xsrc2" (OuterVolumeSpecName: "kube-api-access-xsrc2") pod "62491de6-4c04-49d7-82f2-124f6cceff11" (UID: "62491de6-4c04-49d7-82f2-124f6cceff11"). InnerVolumeSpecName "kube-api-access-xsrc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.564617 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "62491de6-4c04-49d7-82f2-124f6cceff11" (UID: "62491de6-4c04-49d7-82f2-124f6cceff11"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.565241 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "62491de6-4c04-49d7-82f2-124f6cceff11" (UID: "62491de6-4c04-49d7-82f2-124f6cceff11"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.565335 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "62491de6-4c04-49d7-82f2-124f6cceff11" (UID: "62491de6-4c04-49d7-82f2-124f6cceff11"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.565529 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "62491de6-4c04-49d7-82f2-124f6cceff11" (UID: "62491de6-4c04-49d7-82f2-124f6cceff11"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.566392 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "62491de6-4c04-49d7-82f2-124f6cceff11" (UID: "62491de6-4c04-49d7-82f2-124f6cceff11"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.566664 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "62491de6-4c04-49d7-82f2-124f6cceff11" (UID: "62491de6-4c04-49d7-82f2-124f6cceff11"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.567606 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "62491de6-4c04-49d7-82f2-124f6cceff11" (UID: "62491de6-4c04-49d7-82f2-124f6cceff11"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.568617 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "62491de6-4c04-49d7-82f2-124f6cceff11" (UID: "62491de6-4c04-49d7-82f2-124f6cceff11"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.568944 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "62491de6-4c04-49d7-82f2-124f6cceff11" (UID: "62491de6-4c04-49d7-82f2-124f6cceff11"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.599137 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-inventory" (OuterVolumeSpecName: "inventory") pod "62491de6-4c04-49d7-82f2-124f6cceff11" (UID: "62491de6-4c04-49d7-82f2-124f6cceff11"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.608176 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "62491de6-4c04-49d7-82f2-124f6cceff11" (UID: "62491de6-4c04-49d7-82f2-124f6cceff11"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.656861 4742 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.656899 4742 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.656985 4742 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.656999 4742 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.657013 4742 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.657024 4742 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.657038 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.657049 4742 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.657062 4742 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.657075 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsrc2\" (UniqueName: \"kubernetes.io/projected/62491de6-4c04-49d7-82f2-124f6cceff11-kube-api-access-xsrc2\") on node \"crc\" DevicePath \"\"" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.657085 4742 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.657097 4742 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.657109 4742 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.657120 4742 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62491de6-4c04-49d7-82f2-124f6cceff11-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.917995 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" event={"ID":"62491de6-4c04-49d7-82f2-124f6cceff11","Type":"ContainerDied","Data":"058683ca8d2a732bfc93c3426ba434aa25440863f7b844968ad6abf4caafbc90"} Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.918058 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="058683ca8d2a732bfc93c3426ba434aa25440863f7b844968ad6abf4caafbc90" Mar 17 11:46:31 crc kubenswrapper[4742]: I0317 11:46:31.918071 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-6g95k" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.054874 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w"] Mar 17 11:46:32 crc kubenswrapper[4742]: E0317 11:46:32.055619 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1620f5-1ec3-4841-85d3-c162d7e62454" containerName="oc" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.055729 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1620f5-1ec3-4841-85d3-c162d7e62454" containerName="oc" Mar 17 11:46:32 crc kubenswrapper[4742]: E0317 11:46:32.055842 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62491de6-4c04-49d7-82f2-124f6cceff11" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.055979 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="62491de6-4c04-49d7-82f2-124f6cceff11" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.056297 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="62491de6-4c04-49d7-82f2-124f6cceff11" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.056423 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1620f5-1ec3-4841-85d3-c162d7e62454" containerName="oc" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.057314 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.060145 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.060399 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.060618 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.060745 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.062930 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8b7p" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.070009 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w"] Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.171815 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zgs6w\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.172071 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zgs6w\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.172158 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zgs6w\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.172283 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zgs6w\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.172427 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjndz\" (UniqueName: \"kubernetes.io/projected/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-kube-api-access-wjndz\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zgs6w\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.274032 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zgs6w\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.274173 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zgs6w\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.274242 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zgs6w\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.274316 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zgs6w\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.274381 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjndz\" (UniqueName: \"kubernetes.io/projected/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-kube-api-access-wjndz\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zgs6w\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.276995 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zgs6w\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.279893 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zgs6w\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.282183 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zgs6w\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.289448 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zgs6w\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.298761 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjndz\" (UniqueName: \"kubernetes.io/projected/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-kube-api-access-wjndz\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zgs6w\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.383226 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:46:32 crc kubenswrapper[4742]: I0317 11:46:32.978095 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w"] Mar 17 11:46:33 crc kubenswrapper[4742]: I0317 11:46:33.940785 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" event={"ID":"9e7470ef-476f-4d0e-b7ec-349fbc6eff76","Type":"ContainerStarted","Data":"852b5d9edad1efdbff509d9eae12e403d19480c5eaf01bb675bf903dccbb299a"} Mar 17 11:46:33 crc kubenswrapper[4742]: I0317 11:46:33.941201 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" event={"ID":"9e7470ef-476f-4d0e-b7ec-349fbc6eff76","Type":"ContainerStarted","Data":"1e5939366617ab0e34361776d9d3b91ff2f40e397a68647c34aa66ff069befc4"} Mar 17 11:46:33 crc kubenswrapper[4742]: I0317 11:46:33.970825 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" podStartSLOduration=1.401378098 podStartE2EDuration="1.970797386s" podCreationTimestamp="2026-03-17 11:46:32 +0000 UTC" firstStartedPulling="2026-03-17 11:46:32.98237378 +0000 UTC m=+2096.108501578" lastFinishedPulling="2026-03-17 11:46:33.551793068 +0000 UTC m=+2096.677920866" observedRunningTime="2026-03-17 11:46:33.964542173 +0000 UTC m=+2097.090669991" watchObservedRunningTime="2026-03-17 11:46:33.970797386 +0000 UTC m=+2097.096925394" Mar 17 11:46:48 crc kubenswrapper[4742]: I0317 11:46:48.044954 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:46:48 crc kubenswrapper[4742]: I0317 11:46:48.045688 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:46:48 crc kubenswrapper[4742]: I0317 11:46:48.045758 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:46:48 crc kubenswrapper[4742]: I0317 11:46:48.046825 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"f3e0af6893b2594265c0b520ca2bca430428f6f884f7c0a9258384a451ab4bae"} pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 11:46:48 crc kubenswrapper[4742]: I0317 11:46:48.046969 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" containerID="cri-o://f3e0af6893b2594265c0b520ca2bca430428f6f884f7c0a9258384a451ab4bae" gracePeriod=600 Mar 17 11:46:49 crc kubenswrapper[4742]: I0317 11:46:49.116460 4742 generic.go:334] "Generic (PLEG): container finished" podID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerID="f3e0af6893b2594265c0b520ca2bca430428f6f884f7c0a9258384a451ab4bae" exitCode=0 Mar 17 11:46:49 crc kubenswrapper[4742]: I0317 11:46:49.116572 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerDied","Data":"f3e0af6893b2594265c0b520ca2bca430428f6f884f7c0a9258384a451ab4bae"} Mar 17 11:46:49 crc kubenswrapper[4742]: I0317 11:46:49.118200 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca"} Mar 17 11:46:49 crc kubenswrapper[4742]: I0317 11:46:49.118253 4742 scope.go:117] "RemoveContainer" containerID="0a44b0ad41b498f033f6429cf5290f88a5301d91e741bb6a6c84250be7af170d" Mar 17 11:46:55 crc kubenswrapper[4742]: I0317 11:46:55.385064 4742 scope.go:117] "RemoveContainer" containerID="338466036db2afa1f87e10d7311dfc4b58aebdb5b513416a8bc75539002cc133" Mar 17 11:47:41 crc kubenswrapper[4742]: I0317 11:47:41.704700 4742 generic.go:334] "Generic (PLEG): container finished" podID="9e7470ef-476f-4d0e-b7ec-349fbc6eff76" containerID="852b5d9edad1efdbff509d9eae12e403d19480c5eaf01bb675bf903dccbb299a" exitCode=0 Mar 17 11:47:41 crc kubenswrapper[4742]: I0317 11:47:41.704933 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" event={"ID":"9e7470ef-476f-4d0e-b7ec-349fbc6eff76","Type":"ContainerDied","Data":"852b5d9edad1efdbff509d9eae12e403d19480c5eaf01bb675bf903dccbb299a"} Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.242984 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q86lp"] Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.245318 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q86lp" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.260347 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.274441 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q86lp"] Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.310438 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a567a71d-c026-4181-b88e-17c1a4db42eb-utilities\") pod \"redhat-operators-q86lp\" (UID: \"a567a71d-c026-4181-b88e-17c1a4db42eb\") " pod="openshift-marketplace/redhat-operators-q86lp" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.310612 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a567a71d-c026-4181-b88e-17c1a4db42eb-catalog-content\") pod \"redhat-operators-q86lp\" (UID: \"a567a71d-c026-4181-b88e-17c1a4db42eb\") " pod="openshift-marketplace/redhat-operators-q86lp" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.310662 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jls4\" (UniqueName: \"kubernetes.io/projected/a567a71d-c026-4181-b88e-17c1a4db42eb-kube-api-access-7jls4\") pod \"redhat-operators-q86lp\" (UID: \"a567a71d-c026-4181-b88e-17c1a4db42eb\") " pod="openshift-marketplace/redhat-operators-q86lp" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.412137 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ovncontroller-config-0\") pod \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.412198 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-inventory\") pod \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.412233 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjndz\" (UniqueName: \"kubernetes.io/projected/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-kube-api-access-wjndz\") pod \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.412261 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ovn-combined-ca-bundle\") pod \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.412337 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ssh-key-openstack-edpm-ipam\") pod \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\" (UID: \"9e7470ef-476f-4d0e-b7ec-349fbc6eff76\") " Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.412709 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a567a71d-c026-4181-b88e-17c1a4db42eb-utilities\") pod \"redhat-operators-q86lp\" (UID: \"a567a71d-c026-4181-b88e-17c1a4db42eb\") " pod="openshift-marketplace/redhat-operators-q86lp" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.412862 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a567a71d-c026-4181-b88e-17c1a4db42eb-catalog-content\") pod \"redhat-operators-q86lp\" (UID: \"a567a71d-c026-4181-b88e-17c1a4db42eb\") " pod="openshift-marketplace/redhat-operators-q86lp" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.412932 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jls4\" (UniqueName: \"kubernetes.io/projected/a567a71d-c026-4181-b88e-17c1a4db42eb-kube-api-access-7jls4\") pod \"redhat-operators-q86lp\" (UID: \"a567a71d-c026-4181-b88e-17c1a4db42eb\") " pod="openshift-marketplace/redhat-operators-q86lp" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.413208 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a567a71d-c026-4181-b88e-17c1a4db42eb-utilities\") pod \"redhat-operators-q86lp\" (UID: \"a567a71d-c026-4181-b88e-17c1a4db42eb\") " pod="openshift-marketplace/redhat-operators-q86lp" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.413260 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a567a71d-c026-4181-b88e-17c1a4db42eb-catalog-content\") pod \"redhat-operators-q86lp\" (UID: \"a567a71d-c026-4181-b88e-17c1a4db42eb\") " pod="openshift-marketplace/redhat-operators-q86lp" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.422034 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9e7470ef-476f-4d0e-b7ec-349fbc6eff76" (UID: "9e7470ef-476f-4d0e-b7ec-349fbc6eff76"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.422225 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-kube-api-access-wjndz" (OuterVolumeSpecName: "kube-api-access-wjndz") pod "9e7470ef-476f-4d0e-b7ec-349fbc6eff76" (UID: "9e7470ef-476f-4d0e-b7ec-349fbc6eff76"). InnerVolumeSpecName "kube-api-access-wjndz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.433973 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jls4\" (UniqueName: \"kubernetes.io/projected/a567a71d-c026-4181-b88e-17c1a4db42eb-kube-api-access-7jls4\") pod \"redhat-operators-q86lp\" (UID: \"a567a71d-c026-4181-b88e-17c1a4db42eb\") " pod="openshift-marketplace/redhat-operators-q86lp" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.441226 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9e7470ef-476f-4d0e-b7ec-349fbc6eff76" (UID: "9e7470ef-476f-4d0e-b7ec-349fbc6eff76"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.441637 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "9e7470ef-476f-4d0e-b7ec-349fbc6eff76" (UID: "9e7470ef-476f-4d0e-b7ec-349fbc6eff76"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.469574 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-inventory" (OuterVolumeSpecName: "inventory") pod "9e7470ef-476f-4d0e-b7ec-349fbc6eff76" (UID: "9e7470ef-476f-4d0e-b7ec-349fbc6eff76"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.516006 4742 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.516395 4742 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.516473 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjndz\" (UniqueName: \"kubernetes.io/projected/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-kube-api-access-wjndz\") on node \"crc\" DevicePath \"\"" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.516488 4742 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.516500 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e7470ef-476f-4d0e-b7ec-349fbc6eff76-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.572959 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q86lp" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.730837 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" event={"ID":"9e7470ef-476f-4d0e-b7ec-349fbc6eff76","Type":"ContainerDied","Data":"1e5939366617ab0e34361776d9d3b91ff2f40e397a68647c34aa66ff069befc4"} Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.730875 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e5939366617ab0e34361776d9d3b91ff2f40e397a68647c34aa66ff069befc4" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.730940 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zgs6w" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.843595 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt"] Mar 17 11:47:43 crc kubenswrapper[4742]: E0317 11:47:43.844086 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e7470ef-476f-4d0e-b7ec-349fbc6eff76" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.844104 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e7470ef-476f-4d0e-b7ec-349fbc6eff76" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.844319 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e7470ef-476f-4d0e-b7ec-349fbc6eff76" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.845369 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.850263 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.850774 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.851001 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8b7p" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.851146 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.851280 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.851519 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.860324 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt"] Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.930192 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.930240 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbszq\" (UniqueName: \"kubernetes.io/projected/764bf75a-9487-4005-b6ee-ca369e722c4a-kube-api-access-cbszq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.930313 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.930342 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.930367 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:43 crc kubenswrapper[4742]: I0317 11:47:43.930424 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:44 crc kubenswrapper[4742]: I0317 11:47:44.032215 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:44 crc kubenswrapper[4742]: I0317 11:47:44.032274 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:44 crc kubenswrapper[4742]: I0317 11:47:44.032304 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:44 crc kubenswrapper[4742]: I0317 11:47:44.032368 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:44 crc kubenswrapper[4742]: I0317 11:47:44.032428 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:44 crc kubenswrapper[4742]: I0317 11:47:44.033011 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbszq\" (UniqueName: \"kubernetes.io/projected/764bf75a-9487-4005-b6ee-ca369e722c4a-kube-api-access-cbszq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:44 crc kubenswrapper[4742]: I0317 11:47:44.036606 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:44 crc kubenswrapper[4742]: I0317 11:47:44.036606 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:44 crc kubenswrapper[4742]: I0317 11:47:44.036619 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:44 crc kubenswrapper[4742]: I0317 11:47:44.037380 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:44 crc kubenswrapper[4742]: I0317 11:47:44.037417 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:44 crc 
kubenswrapper[4742]: I0317 11:47:44.064246 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q86lp"] Mar 17 11:47:44 crc kubenswrapper[4742]: I0317 11:47:44.069821 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbszq\" (UniqueName: \"kubernetes.io/projected/764bf75a-9487-4005-b6ee-ca369e722c4a-kube-api-access-cbszq\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:44 crc kubenswrapper[4742]: I0317 11:47:44.178313 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:47:44 crc kubenswrapper[4742]: I0317 11:47:44.741931 4742 generic.go:334] "Generic (PLEG): container finished" podID="a567a71d-c026-4181-b88e-17c1a4db42eb" containerID="e80cb23891b11b82e96612f4f7379c2944a748dca3b1021b6d58a74686dea834" exitCode=0 Mar 17 11:47:44 crc kubenswrapper[4742]: I0317 11:47:44.745740 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q86lp" event={"ID":"a567a71d-c026-4181-b88e-17c1a4db42eb","Type":"ContainerDied","Data":"e80cb23891b11b82e96612f4f7379c2944a748dca3b1021b6d58a74686dea834"} Mar 17 11:47:44 crc kubenswrapper[4742]: I0317 11:47:44.745816 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q86lp" event={"ID":"a567a71d-c026-4181-b88e-17c1a4db42eb","Type":"ContainerStarted","Data":"cdd24a12dff3c2178a298a0a2959208bb8064cc496613d9344544e890177fab3"} Mar 17 11:47:44 crc kubenswrapper[4742]: I0317 11:47:44.764871 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt"] Mar 17 11:47:45 crc kubenswrapper[4742]: I0317 11:47:45.756145 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" event={"ID":"764bf75a-9487-4005-b6ee-ca369e722c4a","Type":"ContainerStarted","Data":"80574a4059dd449d17488293b3571c2918804a9778f6df94ca41ba2dbf9723b8"} Mar 17 11:47:45 crc kubenswrapper[4742]: I0317 11:47:45.757711 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" event={"ID":"764bf75a-9487-4005-b6ee-ca369e722c4a","Type":"ContainerStarted","Data":"2db4dc8741d8eb251bc77b6c6992166cdb1f6adce902ec60966a7cfcb846a5c3"} Mar 17 11:47:45 crc kubenswrapper[4742]: I0317 11:47:45.779238 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" podStartSLOduration=2.193199584 podStartE2EDuration="2.779222833s" podCreationTimestamp="2026-03-17 11:47:43 +0000 UTC" firstStartedPulling="2026-03-17 11:47:44.77942493 +0000 UTC m=+2167.905552688" lastFinishedPulling="2026-03-17 11:47:45.365448169 +0000 UTC m=+2168.491575937" observedRunningTime="2026-03-17 11:47:45.774714857 +0000 UTC m=+2168.900842615" watchObservedRunningTime="2026-03-17 11:47:45.779222833 +0000 UTC m=+2168.905350591" Mar 17 11:47:46 crc kubenswrapper[4742]: I0317 11:47:46.784489 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q86lp" 
event={"ID":"a567a71d-c026-4181-b88e-17c1a4db42eb","Type":"ContainerStarted","Data":"881328b00886b3eb522c3ade6da19e925908b1cef618aa6182bd107fcb96007c"} Mar 17 11:47:48 crc kubenswrapper[4742]: I0317 11:47:48.799729 4742 generic.go:334] "Generic (PLEG): container finished" podID="a567a71d-c026-4181-b88e-17c1a4db42eb" containerID="881328b00886b3eb522c3ade6da19e925908b1cef618aa6182bd107fcb96007c" exitCode=0 Mar 17 11:47:48 crc kubenswrapper[4742]: I0317 11:47:48.799798 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q86lp" event={"ID":"a567a71d-c026-4181-b88e-17c1a4db42eb","Type":"ContainerDied","Data":"881328b00886b3eb522c3ade6da19e925908b1cef618aa6182bd107fcb96007c"} Mar 17 11:47:49 crc kubenswrapper[4742]: I0317 11:47:49.816881 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q86lp" event={"ID":"a567a71d-c026-4181-b88e-17c1a4db42eb","Type":"ContainerStarted","Data":"c96eb84776602443c8f89c346f75bdb8e22966eb5b85b263c7a817e5b064fec5"} Mar 17 11:47:49 crc kubenswrapper[4742]: I0317 11:47:49.848425 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q86lp" podStartSLOduration=2.438148466 podStartE2EDuration="6.848408243s" podCreationTimestamp="2026-03-17 11:47:43 +0000 UTC" firstStartedPulling="2026-03-17 11:47:44.776571961 +0000 UTC m=+2167.902699719" lastFinishedPulling="2026-03-17 11:47:49.186831718 +0000 UTC m=+2172.312959496" observedRunningTime="2026-03-17 11:47:49.842002445 +0000 UTC m=+2172.968130233" watchObservedRunningTime="2026-03-17 11:47:49.848408243 +0000 UTC m=+2172.974536001" Mar 17 11:47:53 crc kubenswrapper[4742]: I0317 11:47:53.574245 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q86lp" Mar 17 11:47:53 crc kubenswrapper[4742]: I0317 11:47:53.574717 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q86lp" Mar 17 11:47:54 crc kubenswrapper[4742]: I0317 11:47:54.661646 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-q86lp" podUID="a567a71d-c026-4181-b88e-17c1a4db42eb" containerName="registry-server" probeResult="failure" output=< Mar 17 11:47:54 crc kubenswrapper[4742]: timeout: failed to connect service ":50051" within 1s Mar 17 11:47:54 crc kubenswrapper[4742]: > Mar 17 11:48:00 crc kubenswrapper[4742]: I0317 11:48:00.158687 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562468-dwhfv"] Mar 17 11:48:00 crc kubenswrapper[4742]: I0317 11:48:00.161232 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562468-dwhfv" Mar 17 11:48:00 crc kubenswrapper[4742]: I0317 11:48:00.164124 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:48:00 crc kubenswrapper[4742]: I0317 11:48:00.164154 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:48:00 crc kubenswrapper[4742]: I0317 11:48:00.164521 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:48:00 crc kubenswrapper[4742]: I0317 11:48:00.170607 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562468-dwhfv"] Mar 17 11:48:00 crc kubenswrapper[4742]: I0317 11:48:00.318217 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbbg9\" (UniqueName: \"kubernetes.io/projected/69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0-kube-api-access-hbbg9\") pod \"auto-csr-approver-29562468-dwhfv\" (UID: \"69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0\") " pod="openshift-infra/auto-csr-approver-29562468-dwhfv" Mar 17 11:48:00 crc kubenswrapper[4742]: I0317 11:48:00.420737 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbbg9\" (UniqueName: \"kubernetes.io/projected/69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0-kube-api-access-hbbg9\") pod \"auto-csr-approver-29562468-dwhfv\" (UID: \"69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0\") " pod="openshift-infra/auto-csr-approver-29562468-dwhfv" Mar 17 11:48:00 crc kubenswrapper[4742]: I0317 11:48:00.449841 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbbg9\" (UniqueName: \"kubernetes.io/projected/69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0-kube-api-access-hbbg9\") pod \"auto-csr-approver-29562468-dwhfv\" (UID: \"69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0\") " pod="openshift-infra/auto-csr-approver-29562468-dwhfv" Mar 17 11:48:00 crc kubenswrapper[4742]: I0317 11:48:00.489502 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562468-dwhfv" Mar 17 11:48:00 crc kubenswrapper[4742]: W0317 11:48:00.944259 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69a5cf8a_b6bd_4cd5_b7b6_fbc90abaaea0.slice/crio-0835dc6ca03d4509cd79aa7255632d7d9249af3b01829e1c87ca4a18a00db424 WatchSource:0}: Error finding container 0835dc6ca03d4509cd79aa7255632d7d9249af3b01829e1c87ca4a18a00db424: Status 404 returned error can't find the container with id 0835dc6ca03d4509cd79aa7255632d7d9249af3b01829e1c87ca4a18a00db424 Mar 17 11:48:00 crc kubenswrapper[4742]: I0317 11:48:00.948345 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562468-dwhfv"] Mar 17 11:48:01 crc kubenswrapper[4742]: I0317 11:48:01.979676 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562468-dwhfv" event={"ID":"69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0","Type":"ContainerStarted","Data":"0835dc6ca03d4509cd79aa7255632d7d9249af3b01829e1c87ca4a18a00db424"} Mar 17 11:48:03 crc kubenswrapper[4742]: I0317 11:48:03.002963 4742 generic.go:334] "Generic (PLEG): container finished" podID="69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0" containerID="f4cae3987486b6b32db2c0eb80fb9aad24c86de206830bf9ad232970128bd8e3" exitCode=0 Mar 17 11:48:03 crc kubenswrapper[4742]: I0317 11:48:03.003109 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562468-dwhfv" event={"ID":"69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0","Type":"ContainerDied","Data":"f4cae3987486b6b32db2c0eb80fb9aad24c86de206830bf9ad232970128bd8e3"} Mar 17 11:48:03 crc kubenswrapper[4742]: I0317 11:48:03.654747 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q86lp" Mar 17 11:48:03 crc kubenswrapper[4742]: I0317 11:48:03.716279 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q86lp" Mar 17 11:48:03 crc kubenswrapper[4742]: I0317 11:48:03.913085 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q86lp"] Mar 17 11:48:04 crc kubenswrapper[4742]: I0317 11:48:04.422753 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562468-dwhfv" Mar 17 11:48:04 crc kubenswrapper[4742]: I0317 11:48:04.503635 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbbg9\" (UniqueName: \"kubernetes.io/projected/69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0-kube-api-access-hbbg9\") pod \"69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0\" (UID: \"69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0\") " Mar 17 11:48:04 crc kubenswrapper[4742]: I0317 11:48:04.512227 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0-kube-api-access-hbbg9" (OuterVolumeSpecName: "kube-api-access-hbbg9") pod "69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0" (UID: "69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0"). InnerVolumeSpecName "kube-api-access-hbbg9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:48:04 crc kubenswrapper[4742]: I0317 11:48:04.606549 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbbg9\" (UniqueName: \"kubernetes.io/projected/69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0-kube-api-access-hbbg9\") on node \"crc\" DevicePath \"\"" Mar 17 11:48:05 crc kubenswrapper[4742]: I0317 11:48:05.031467 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q86lp" podUID="a567a71d-c026-4181-b88e-17c1a4db42eb" containerName="registry-server" containerID="cri-o://c96eb84776602443c8f89c346f75bdb8e22966eb5b85b263c7a817e5b064fec5" gracePeriod=2 Mar 17 11:48:05 crc kubenswrapper[4742]: I0317 11:48:05.031749 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562468-dwhfv" Mar 17 11:48:05 crc kubenswrapper[4742]: I0317 11:48:05.031774 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562468-dwhfv" event={"ID":"69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0","Type":"ContainerDied","Data":"0835dc6ca03d4509cd79aa7255632d7d9249af3b01829e1c87ca4a18a00db424"} Mar 17 11:48:05 crc kubenswrapper[4742]: I0317 11:48:05.032180 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0835dc6ca03d4509cd79aa7255632d7d9249af3b01829e1c87ca4a18a00db424" Mar 17 11:48:05 crc kubenswrapper[4742]: I0317 11:48:05.500553 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562462-8bc7k"] Mar 17 11:48:05 crc kubenswrapper[4742]: I0317 11:48:05.507361 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562462-8bc7k"] Mar 17 11:48:05 crc kubenswrapper[4742]: I0317 11:48:05.510244 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q86lp" Mar 17 11:48:05 crc kubenswrapper[4742]: I0317 11:48:05.634441 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a567a71d-c026-4181-b88e-17c1a4db42eb-catalog-content\") pod \"a567a71d-c026-4181-b88e-17c1a4db42eb\" (UID: \"a567a71d-c026-4181-b88e-17c1a4db42eb\") " Mar 17 11:48:05 crc kubenswrapper[4742]: I0317 11:48:05.635161 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jls4\" (UniqueName: \"kubernetes.io/projected/a567a71d-c026-4181-b88e-17c1a4db42eb-kube-api-access-7jls4\") pod \"a567a71d-c026-4181-b88e-17c1a4db42eb\" (UID: \"a567a71d-c026-4181-b88e-17c1a4db42eb\") " Mar 17 11:48:05 crc kubenswrapper[4742]: I0317 11:48:05.635230 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a567a71d-c026-4181-b88e-17c1a4db42eb-utilities\") pod \"a567a71d-c026-4181-b88e-17c1a4db42eb\" (UID: \"a567a71d-c026-4181-b88e-17c1a4db42eb\") " Mar 17 11:48:05 crc kubenswrapper[4742]: I0317 11:48:05.635801 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a567a71d-c026-4181-b88e-17c1a4db42eb-utilities" (OuterVolumeSpecName: "utilities") pod "a567a71d-c026-4181-b88e-17c1a4db42eb" (UID: "a567a71d-c026-4181-b88e-17c1a4db42eb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:48:05 crc kubenswrapper[4742]: I0317 11:48:05.641187 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a567a71d-c026-4181-b88e-17c1a4db42eb-kube-api-access-7jls4" (OuterVolumeSpecName: "kube-api-access-7jls4") pod "a567a71d-c026-4181-b88e-17c1a4db42eb" (UID: "a567a71d-c026-4181-b88e-17c1a4db42eb"). InnerVolumeSpecName "kube-api-access-7jls4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:48:05 crc kubenswrapper[4742]: I0317 11:48:05.737999 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jls4\" (UniqueName: \"kubernetes.io/projected/a567a71d-c026-4181-b88e-17c1a4db42eb-kube-api-access-7jls4\") on node \"crc\" DevicePath \"\"" Mar 17 11:48:05 crc kubenswrapper[4742]: I0317 11:48:05.738035 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a567a71d-c026-4181-b88e-17c1a4db42eb-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:48:05 crc kubenswrapper[4742]: I0317 11:48:05.764636 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a567a71d-c026-4181-b88e-17c1a4db42eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a567a71d-c026-4181-b88e-17c1a4db42eb" (UID: "a567a71d-c026-4181-b88e-17c1a4db42eb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:48:05 crc kubenswrapper[4742]: I0317 11:48:05.839474 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a567a71d-c026-4181-b88e-17c1a4db42eb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:48:06 crc kubenswrapper[4742]: I0317 11:48:06.043022 4742 generic.go:334] "Generic (PLEG): container finished" podID="a567a71d-c026-4181-b88e-17c1a4db42eb" containerID="c96eb84776602443c8f89c346f75bdb8e22966eb5b85b263c7a817e5b064fec5" exitCode=0 Mar 17 11:48:06 crc kubenswrapper[4742]: I0317 11:48:06.043077 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q86lp" event={"ID":"a567a71d-c026-4181-b88e-17c1a4db42eb","Type":"ContainerDied","Data":"c96eb84776602443c8f89c346f75bdb8e22966eb5b85b263c7a817e5b064fec5"} Mar 17 11:48:06 crc kubenswrapper[4742]: I0317 11:48:06.043126 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q86lp" event={"ID":"a567a71d-c026-4181-b88e-17c1a4db42eb","Type":"ContainerDied","Data":"cdd24a12dff3c2178a298a0a2959208bb8064cc496613d9344544e890177fab3"} Mar 17 11:48:06 crc kubenswrapper[4742]: I0317 11:48:06.043156 4742 scope.go:117] "RemoveContainer" containerID="c96eb84776602443c8f89c346f75bdb8e22966eb5b85b263c7a817e5b064fec5" Mar 17 11:48:06 crc kubenswrapper[4742]: I0317 11:48:06.043159 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q86lp" Mar 17 11:48:06 crc kubenswrapper[4742]: I0317 11:48:06.061882 4742 scope.go:117] "RemoveContainer" containerID="881328b00886b3eb522c3ade6da19e925908b1cef618aa6182bd107fcb96007c" Mar 17 11:48:06 crc kubenswrapper[4742]: I0317 11:48:06.084975 4742 scope.go:117] "RemoveContainer" containerID="e80cb23891b11b82e96612f4f7379c2944a748dca3b1021b6d58a74686dea834" Mar 17 11:48:06 crc kubenswrapper[4742]: I0317 11:48:06.150645 4742 scope.go:117] "RemoveContainer" containerID="c96eb84776602443c8f89c346f75bdb8e22966eb5b85b263c7a817e5b064fec5" Mar 17 11:48:06 crc kubenswrapper[4742]: E0317 11:48:06.156454 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c96eb84776602443c8f89c346f75bdb8e22966eb5b85b263c7a817e5b064fec5\": container with ID starting with c96eb84776602443c8f89c346f75bdb8e22966eb5b85b263c7a817e5b064fec5 not found: ID does not exist" containerID="c96eb84776602443c8f89c346f75bdb8e22966eb5b85b263c7a817e5b064fec5" Mar 17 11:48:06 crc kubenswrapper[4742]: I0317 11:48:06.156523 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c96eb84776602443c8f89c346f75bdb8e22966eb5b85b263c7a817e5b064fec5"} err="failed to get container status \"c96eb84776602443c8f89c346f75bdb8e22966eb5b85b263c7a817e5b064fec5\": rpc error: code = NotFound desc = could not find container \"c96eb84776602443c8f89c346f75bdb8e22966eb5b85b263c7a817e5b064fec5\": container with ID starting with c96eb84776602443c8f89c346f75bdb8e22966eb5b85b263c7a817e5b064fec5 not found: ID does not exist" Mar 17 11:48:06 crc kubenswrapper[4742]: I0317 11:48:06.156563 4742 scope.go:117] "RemoveContainer" containerID="881328b00886b3eb522c3ade6da19e925908b1cef618aa6182bd107fcb96007c" Mar 17 11:48:06 crc kubenswrapper[4742]: E0317 11:48:06.157188 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"881328b00886b3eb522c3ade6da19e925908b1cef618aa6182bd107fcb96007c\": container with ID starting with 881328b00886b3eb522c3ade6da19e925908b1cef618aa6182bd107fcb96007c not found: ID does not exist" containerID="881328b00886b3eb522c3ade6da19e925908b1cef618aa6182bd107fcb96007c" Mar 17 11:48:06 crc kubenswrapper[4742]: I0317 11:48:06.157228 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"881328b00886b3eb522c3ade6da19e925908b1cef618aa6182bd107fcb96007c"} err="failed to get container status \"881328b00886b3eb522c3ade6da19e925908b1cef618aa6182bd107fcb96007c\": rpc error: code = NotFound desc = could not find container \"881328b00886b3eb522c3ade6da19e925908b1cef618aa6182bd107fcb96007c\": container with ID starting with 881328b00886b3eb522c3ade6da19e925908b1cef618aa6182bd107fcb96007c not found: ID does not exist" Mar 17 11:48:06 crc kubenswrapper[4742]: I0317 11:48:06.157252 4742 scope.go:117] "RemoveContainer" containerID="e80cb23891b11b82e96612f4f7379c2944a748dca3b1021b6d58a74686dea834" Mar 17 11:48:06 crc kubenswrapper[4742]: E0317 11:48:06.157630 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e80cb23891b11b82e96612f4f7379c2944a748dca3b1021b6d58a74686dea834\": container with ID starting with e80cb23891b11b82e96612f4f7379c2944a748dca3b1021b6d58a74686dea834 not found: ID does not exist" containerID="e80cb23891b11b82e96612f4f7379c2944a748dca3b1021b6d58a74686dea834" 
Mar 17 11:48:06 crc kubenswrapper[4742]: I0317 11:48:06.157671 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e80cb23891b11b82e96612f4f7379c2944a748dca3b1021b6d58a74686dea834"} err="failed to get container status \"e80cb23891b11b82e96612f4f7379c2944a748dca3b1021b6d58a74686dea834\": rpc error: code = NotFound desc = could not find container \"e80cb23891b11b82e96612f4f7379c2944a748dca3b1021b6d58a74686dea834\": container with ID starting with e80cb23891b11b82e96612f4f7379c2944a748dca3b1021b6d58a74686dea834 not found: ID does not exist" Mar 17 11:48:06 crc kubenswrapper[4742]: I0317 11:48:06.158085 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q86lp"] Mar 17 11:48:06 crc kubenswrapper[4742]: I0317 11:48:06.170372 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q86lp"] Mar 17 11:48:06 crc kubenswrapper[4742]: I0317 11:48:06.677832 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a567a71d-c026-4181-b88e-17c1a4db42eb" path="/var/lib/kubelet/pods/a567a71d-c026-4181-b88e-17c1a4db42eb/volumes" Mar 17 11:48:06 crc kubenswrapper[4742]: I0317 11:48:06.679325 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8bc9906-5271-4a69-8fa3-e5106f062ac2" path="/var/lib/kubelet/pods/e8bc9906-5271-4a69-8fa3-e5106f062ac2/volumes" Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.699529 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n5sxb"] Mar 17 11:48:28 crc kubenswrapper[4742]: E0317 11:48:28.701629 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a567a71d-c026-4181-b88e-17c1a4db42eb" containerName="registry-server" Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.701645 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="a567a71d-c026-4181-b88e-17c1a4db42eb" containerName="registry-server" Mar 17 11:48:28 crc kubenswrapper[4742]: E0317 11:48:28.701667 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0" containerName="oc" Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.701694 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0" containerName="oc" Mar 17 11:48:28 crc kubenswrapper[4742]: E0317 11:48:28.701730 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a567a71d-c026-4181-b88e-17c1a4db42eb" containerName="extract-utilities" Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.701736 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="a567a71d-c026-4181-b88e-17c1a4db42eb" containerName="extract-utilities" Mar 17 11:48:28 crc kubenswrapper[4742]: E0317 11:48:28.701812 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a567a71d-c026-4181-b88e-17c1a4db42eb" containerName="extract-content" Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.701821 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="a567a71d-c026-4181-b88e-17c1a4db42eb" containerName="extract-content" Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.702229 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0" containerName="oc" Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.702263 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="a567a71d-c026-4181-b88e-17c1a4db42eb" 
containerName="registry-server" Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.707184 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n5sxb"] Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.707613 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n5sxb" Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.789842 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-utilities\") pod \"certified-operators-n5sxb\" (UID: \"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a\") " pod="openshift-marketplace/certified-operators-n5sxb" Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.789884 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssmz9\" (UniqueName: \"kubernetes.io/projected/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-kube-api-access-ssmz9\") pod \"certified-operators-n5sxb\" (UID: \"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a\") " pod="openshift-marketplace/certified-operators-n5sxb" Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.790395 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-catalog-content\") pod \"certified-operators-n5sxb\" (UID: \"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a\") " pod="openshift-marketplace/certified-operators-n5sxb" Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.892360 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-catalog-content\") pod \"certified-operators-n5sxb\" (UID: \"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a\") " pod="openshift-marketplace/certified-operators-n5sxb" Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.892526 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-utilities\") pod \"certified-operators-n5sxb\" (UID: \"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a\") " pod="openshift-marketplace/certified-operators-n5sxb" Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.892553 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssmz9\" (UniqueName: \"kubernetes.io/projected/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-kube-api-access-ssmz9\") pod \"certified-operators-n5sxb\" (UID: \"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a\") " pod="openshift-marketplace/certified-operators-n5sxb" Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.892983 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-catalog-content\") pod \"certified-operators-n5sxb\" (UID: \"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a\") " pod="openshift-marketplace/certified-operators-n5sxb" Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.893088 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-utilities\") pod \"certified-operators-n5sxb\" (UID: \"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a\") " 
pod="openshift-marketplace/certified-operators-n5sxb" Mar 17 11:48:28 crc kubenswrapper[4742]: I0317 11:48:28.920667 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssmz9\" (UniqueName: \"kubernetes.io/projected/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-kube-api-access-ssmz9\") pod \"certified-operators-n5sxb\" (UID: \"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a\") " pod="openshift-marketplace/certified-operators-n5sxb" Mar 17 11:48:29 crc kubenswrapper[4742]: I0317 11:48:29.030188 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n5sxb" Mar 17 11:48:29 crc kubenswrapper[4742]: I0317 11:48:29.577160 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n5sxb"] Mar 17 11:48:30 crc kubenswrapper[4742]: I0317 11:48:30.512117 4742 generic.go:334] "Generic (PLEG): container finished" podID="42b7e86e-1fbb-4f57-aee2-a5b1df40e17a" containerID="aa7d91cc53d20a91b64625e148df394b0b38d434cd1891bebdc78ea3268697e8" exitCode=0 Mar 17 11:48:30 crc kubenswrapper[4742]: I0317 11:48:30.512232 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5sxb" event={"ID":"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a","Type":"ContainerDied","Data":"aa7d91cc53d20a91b64625e148df394b0b38d434cd1891bebdc78ea3268697e8"} Mar 17 11:48:30 crc kubenswrapper[4742]: I0317 11:48:30.512499 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5sxb" event={"ID":"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a","Type":"ContainerStarted","Data":"b554bc2309cd67e4f6f5a3321fc78f22d438667c12fadf27f57c1a69b248d586"} Mar 17 11:48:30 crc kubenswrapper[4742]: I0317 11:48:30.516091 4742 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 11:48:31 crc kubenswrapper[4742]: I0317 11:48:31.521921 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5sxb" event={"ID":"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a","Type":"ContainerStarted","Data":"dcb57e2aac554aef413a49962d51cd928b7a36bc22bc375a1a1f0d4f3287fd12"} Mar 17 11:48:32 crc kubenswrapper[4742]: I0317 11:48:32.532417 4742 generic.go:334] "Generic (PLEG): container finished" podID="42b7e86e-1fbb-4f57-aee2-a5b1df40e17a" containerID="dcb57e2aac554aef413a49962d51cd928b7a36bc22bc375a1a1f0d4f3287fd12" exitCode=0 Mar 17 11:48:32 crc kubenswrapper[4742]: I0317 11:48:32.532485 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5sxb" event={"ID":"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a","Type":"ContainerDied","Data":"dcb57e2aac554aef413a49962d51cd928b7a36bc22bc375a1a1f0d4f3287fd12"} Mar 17 11:48:33 crc kubenswrapper[4742]: I0317 11:48:33.544541 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5sxb" event={"ID":"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a","Type":"ContainerStarted","Data":"b5bb753ef738f01c09c46a3b78830b1b7d16e0aca10b99c311f0229b93cd308f"} Mar 17 11:48:33 crc kubenswrapper[4742]: I0317 11:48:33.569379 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n5sxb" podStartSLOduration=3.122002464 podStartE2EDuration="5.569358219s" podCreationTimestamp="2026-03-17 11:48:28 +0000 UTC" firstStartedPulling="2026-03-17 11:48:30.515794578 +0000 UTC m=+2213.641922346" lastFinishedPulling="2026-03-17 11:48:32.963150343 
+0000 UTC m=+2216.089278101" observedRunningTime="2026-03-17 11:48:33.563857577 +0000 UTC m=+2216.689985355" watchObservedRunningTime="2026-03-17 11:48:33.569358219 +0000 UTC m=+2216.695485977" Mar 17 11:48:36 crc kubenswrapper[4742]: I0317 11:48:36.575612 4742 generic.go:334] "Generic (PLEG): container finished" podID="764bf75a-9487-4005-b6ee-ca369e722c4a" containerID="80574a4059dd449d17488293b3571c2918804a9778f6df94ca41ba2dbf9723b8" exitCode=0 Mar 17 11:48:36 crc kubenswrapper[4742]: I0317 11:48:36.575780 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" event={"ID":"764bf75a-9487-4005-b6ee-ca369e722c4a","Type":"ContainerDied","Data":"80574a4059dd449d17488293b3571c2918804a9778f6df94ca41ba2dbf9723b8"} Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.002891 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.179565 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbszq\" (UniqueName: \"kubernetes.io/projected/764bf75a-9487-4005-b6ee-ca369e722c4a-kube-api-access-cbszq\") pod \"764bf75a-9487-4005-b6ee-ca369e722c4a\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.180251 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-neutron-metadata-combined-ca-bundle\") pod \"764bf75a-9487-4005-b6ee-ca369e722c4a\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.180412 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-ssh-key-openstack-edpm-ipam\") pod \"764bf75a-9487-4005-b6ee-ca369e722c4a\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.180617 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-nova-metadata-neutron-config-0\") pod \"764bf75a-9487-4005-b6ee-ca369e722c4a\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.180708 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-inventory\") pod \"764bf75a-9487-4005-b6ee-ca369e722c4a\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.180811 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-neutron-ovn-metadata-agent-neutron-config-0\") pod \"764bf75a-9487-4005-b6ee-ca369e722c4a\" (UID: \"764bf75a-9487-4005-b6ee-ca369e722c4a\") " Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.185326 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/764bf75a-9487-4005-b6ee-ca369e722c4a-kube-api-access-cbszq" (OuterVolumeSpecName: "kube-api-access-cbszq") 
pod "764bf75a-9487-4005-b6ee-ca369e722c4a" (UID: "764bf75a-9487-4005-b6ee-ca369e722c4a"). InnerVolumeSpecName "kube-api-access-cbszq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.185763 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "764bf75a-9487-4005-b6ee-ca369e722c4a" (UID: "764bf75a-9487-4005-b6ee-ca369e722c4a"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.212163 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-inventory" (OuterVolumeSpecName: "inventory") pod "764bf75a-9487-4005-b6ee-ca369e722c4a" (UID: "764bf75a-9487-4005-b6ee-ca369e722c4a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.219890 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "764bf75a-9487-4005-b6ee-ca369e722c4a" (UID: "764bf75a-9487-4005-b6ee-ca369e722c4a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.227843 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "764bf75a-9487-4005-b6ee-ca369e722c4a" (UID: "764bf75a-9487-4005-b6ee-ca369e722c4a"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.233841 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "764bf75a-9487-4005-b6ee-ca369e722c4a" (UID: "764bf75a-9487-4005-b6ee-ca369e722c4a"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.283440 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbszq\" (UniqueName: \"kubernetes.io/projected/764bf75a-9487-4005-b6ee-ca369e722c4a-kube-api-access-cbszq\") on node \"crc\" DevicePath \"\"" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.283488 4742 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.283510 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.283524 4742 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.283541 4742 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.283560 4742 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/764bf75a-9487-4005-b6ee-ca369e722c4a-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.597427 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" event={"ID":"764bf75a-9487-4005-b6ee-ca369e722c4a","Type":"ContainerDied","Data":"2db4dc8741d8eb251bc77b6c6992166cdb1f6adce902ec60966a7cfcb846a5c3"} Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.597506 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2db4dc8741d8eb251bc77b6c6992166cdb1f6adce902ec60966a7cfcb846a5c3" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.597513 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.709419 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9"] Mar 17 11:48:38 crc kubenswrapper[4742]: E0317 11:48:38.713357 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="764bf75a-9487-4005-b6ee-ca369e722c4a" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.713391 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="764bf75a-9487-4005-b6ee-ca369e722c4a" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.713591 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="764bf75a-9487-4005-b6ee-ca369e722c4a" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.714219 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.716154 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.716382 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.717850 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.718121 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8b7p" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.719229 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.727461 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9"] Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.894570 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.894627 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.894716 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.894745 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6npbv\" (UniqueName: \"kubernetes.io/projected/7fd024b3-844f-4118-92b5-81dcc6da9fd6-kube-api-access-6npbv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.894784 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.996875 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.996966 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6npbv\" (UniqueName: \"kubernetes.io/projected/7fd024b3-844f-4118-92b5-81dcc6da9fd6-kube-api-access-6npbv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.997024 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.997135 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:48:38 crc kubenswrapper[4742]: I0317 11:48:38.997178 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:48:39 crc kubenswrapper[4742]: I0317 11:48:39.000481 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9\" (UID: 
\"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:48:39 crc kubenswrapper[4742]: I0317 11:48:39.001586 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:48:39 crc kubenswrapper[4742]: I0317 11:48:39.002147 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:48:39 crc kubenswrapper[4742]: I0317 11:48:39.004000 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:48:39 crc kubenswrapper[4742]: I0317 11:48:39.013059 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6npbv\" (UniqueName: \"kubernetes.io/projected/7fd024b3-844f-4118-92b5-81dcc6da9fd6-kube-api-access-6npbv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:48:39 crc kubenswrapper[4742]: I0317 11:48:39.030671 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n5sxb" Mar 17 11:48:39 crc kubenswrapper[4742]: I0317 11:48:39.030939 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n5sxb" Mar 17 11:48:39 crc kubenswrapper[4742]: I0317 11:48:39.053391 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:48:39 crc kubenswrapper[4742]: I0317 11:48:39.098328 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n5sxb" Mar 17 11:48:39 crc kubenswrapper[4742]: I0317 11:48:39.397261 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9"] Mar 17 11:48:39 crc kubenswrapper[4742]: I0317 11:48:39.607019 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" event={"ID":"7fd024b3-844f-4118-92b5-81dcc6da9fd6","Type":"ContainerStarted","Data":"b68f745e97db4c5cafd995db63ede69dd7468010d264aac205c52b277fbbaeaa"} Mar 17 11:48:39 crc kubenswrapper[4742]: I0317 11:48:39.654647 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n5sxb" Mar 17 11:48:39 crc kubenswrapper[4742]: I0317 11:48:39.705483 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n5sxb"] Mar 17 11:48:40 crc kubenswrapper[4742]: I0317 11:48:40.619466 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" event={"ID":"7fd024b3-844f-4118-92b5-81dcc6da9fd6","Type":"ContainerStarted","Data":"ecdf63b8b4171171266f48d9e154fda9b49c6ccb8997821fe4870ca7644a2069"} Mar 17 11:48:40 crc kubenswrapper[4742]: I0317 11:48:40.652462 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" podStartSLOduration=2.204443378 podStartE2EDuration="2.652441196s" podCreationTimestamp="2026-03-17 11:48:38 +0000 UTC" firstStartedPulling="2026-03-17 11:48:39.403043553 +0000 UTC m=+2222.529171331" lastFinishedPulling="2026-03-17 11:48:39.851041371 +0000 UTC m=+2222.977169149" observedRunningTime="2026-03-17 11:48:40.642114131 +0000 UTC m=+2223.768241899" watchObservedRunningTime="2026-03-17 11:48:40.652441196 +0000 UTC m=+2223.778568964" Mar 17 11:48:41 crc kubenswrapper[4742]: I0317 11:48:41.633473 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n5sxb" podUID="42b7e86e-1fbb-4f57-aee2-a5b1df40e17a" containerName="registry-server" containerID="cri-o://b5bb753ef738f01c09c46a3b78830b1b7d16e0aca10b99c311f0229b93cd308f" gracePeriod=2 Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.124835 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n5sxb" Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.273901 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-catalog-content\") pod \"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a\" (UID: \"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a\") " Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.273986 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssmz9\" (UniqueName: \"kubernetes.io/projected/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-kube-api-access-ssmz9\") pod \"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a\" (UID: \"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a\") " Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.274198 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-utilities\") pod \"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a\" (UID: \"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a\") " Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.275059 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-utilities" (OuterVolumeSpecName: "utilities") pod "42b7e86e-1fbb-4f57-aee2-a5b1df40e17a" (UID: "42b7e86e-1fbb-4f57-aee2-a5b1df40e17a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.281986 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-kube-api-access-ssmz9" (OuterVolumeSpecName: "kube-api-access-ssmz9") pod "42b7e86e-1fbb-4f57-aee2-a5b1df40e17a" (UID: "42b7e86e-1fbb-4f57-aee2-a5b1df40e17a"). InnerVolumeSpecName "kube-api-access-ssmz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.330969 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42b7e86e-1fbb-4f57-aee2-a5b1df40e17a" (UID: "42b7e86e-1fbb-4f57-aee2-a5b1df40e17a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.376234 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.376272 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.376283 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssmz9\" (UniqueName: \"kubernetes.io/projected/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a-kube-api-access-ssmz9\") on node \"crc\" DevicePath \"\"" Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.643845 4742 generic.go:334] "Generic (PLEG): container finished" podID="42b7e86e-1fbb-4f57-aee2-a5b1df40e17a" containerID="b5bb753ef738f01c09c46a3b78830b1b7d16e0aca10b99c311f0229b93cd308f" exitCode=0 Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.643950 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5sxb" event={"ID":"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a","Type":"ContainerDied","Data":"b5bb753ef738f01c09c46a3b78830b1b7d16e0aca10b99c311f0229b93cd308f"} Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.644233 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n5sxb" event={"ID":"42b7e86e-1fbb-4f57-aee2-a5b1df40e17a","Type":"ContainerDied","Data":"b554bc2309cd67e4f6f5a3321fc78f22d438667c12fadf27f57c1a69b248d586"} Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.644258 4742 scope.go:117] "RemoveContainer" containerID="b5bb753ef738f01c09c46a3b78830b1b7d16e0aca10b99c311f0229b93cd308f" Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.643967 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n5sxb" Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.675349 4742 scope.go:117] "RemoveContainer" containerID="dcb57e2aac554aef413a49962d51cd928b7a36bc22bc375a1a1f0d4f3287fd12" Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.686325 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n5sxb"] Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.693314 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n5sxb"] Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.694341 4742 scope.go:117] "RemoveContainer" containerID="aa7d91cc53d20a91b64625e148df394b0b38d434cd1891bebdc78ea3268697e8" Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.740533 4742 scope.go:117] "RemoveContainer" containerID="b5bb753ef738f01c09c46a3b78830b1b7d16e0aca10b99c311f0229b93cd308f" Mar 17 11:48:42 crc kubenswrapper[4742]: E0317 11:48:42.744205 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5bb753ef738f01c09c46a3b78830b1b7d16e0aca10b99c311f0229b93cd308f\": container with ID starting with b5bb753ef738f01c09c46a3b78830b1b7d16e0aca10b99c311f0229b93cd308f not found: ID does not exist" containerID="b5bb753ef738f01c09c46a3b78830b1b7d16e0aca10b99c311f0229b93cd308f" Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.744247 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5bb753ef738f01c09c46a3b78830b1b7d16e0aca10b99c311f0229b93cd308f"} err="failed to get container status \"b5bb753ef738f01c09c46a3b78830b1b7d16e0aca10b99c311f0229b93cd308f\": rpc error: code = NotFound desc = could not find container \"b5bb753ef738f01c09c46a3b78830b1b7d16e0aca10b99c311f0229b93cd308f\": container with ID starting with b5bb753ef738f01c09c46a3b78830b1b7d16e0aca10b99c311f0229b93cd308f not found: ID does not exist" Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.744273 4742 scope.go:117] "RemoveContainer" containerID="dcb57e2aac554aef413a49962d51cd928b7a36bc22bc375a1a1f0d4f3287fd12" Mar 17 11:48:42 crc kubenswrapper[4742]: E0317 11:48:42.744761 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcb57e2aac554aef413a49962d51cd928b7a36bc22bc375a1a1f0d4f3287fd12\": container with ID starting with dcb57e2aac554aef413a49962d51cd928b7a36bc22bc375a1a1f0d4f3287fd12 not found: ID does not exist" containerID="dcb57e2aac554aef413a49962d51cd928b7a36bc22bc375a1a1f0d4f3287fd12" Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.744804 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcb57e2aac554aef413a49962d51cd928b7a36bc22bc375a1a1f0d4f3287fd12"} err="failed to get container status \"dcb57e2aac554aef413a49962d51cd928b7a36bc22bc375a1a1f0d4f3287fd12\": rpc error: code = NotFound desc = could not find container \"dcb57e2aac554aef413a49962d51cd928b7a36bc22bc375a1a1f0d4f3287fd12\": container with ID starting with dcb57e2aac554aef413a49962d51cd928b7a36bc22bc375a1a1f0d4f3287fd12 not found: ID does not exist" Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.744835 4742 scope.go:117] "RemoveContainer" containerID="aa7d91cc53d20a91b64625e148df394b0b38d434cd1891bebdc78ea3268697e8" Mar 17 11:48:42 crc kubenswrapper[4742]: E0317 11:48:42.745159 4742 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"aa7d91cc53d20a91b64625e148df394b0b38d434cd1891bebdc78ea3268697e8\": container with ID starting with aa7d91cc53d20a91b64625e148df394b0b38d434cd1891bebdc78ea3268697e8 not found: ID does not exist" containerID="aa7d91cc53d20a91b64625e148df394b0b38d434cd1891bebdc78ea3268697e8" Mar 17 11:48:42 crc kubenswrapper[4742]: I0317 11:48:42.745193 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa7d91cc53d20a91b64625e148df394b0b38d434cd1891bebdc78ea3268697e8"} err="failed to get container status \"aa7d91cc53d20a91b64625e148df394b0b38d434cd1891bebdc78ea3268697e8\": rpc error: code = NotFound desc = could not find container \"aa7d91cc53d20a91b64625e148df394b0b38d434cd1891bebdc78ea3268697e8\": container with ID starting with aa7d91cc53d20a91b64625e148df394b0b38d434cd1891bebdc78ea3268697e8 not found: ID does not exist" Mar 17 11:48:44 crc kubenswrapper[4742]: I0317 11:48:44.672247 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42b7e86e-1fbb-4f57-aee2-a5b1df40e17a" path="/var/lib/kubelet/pods/42b7e86e-1fbb-4f57-aee2-a5b1df40e17a/volumes" Mar 17 11:48:48 crc kubenswrapper[4742]: I0317 11:48:48.043867 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:48:48 crc kubenswrapper[4742]: I0317 11:48:48.045521 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:48:55 crc kubenswrapper[4742]: I0317 11:48:55.533566 4742 scope.go:117] "RemoveContainer" containerID="b6f9d44cd7e38ad91669d5e736d3b37c406dec4a78ae39cf25b269eb0eaeefd3" Mar 17 11:49:18 crc kubenswrapper[4742]: I0317 11:49:18.044546 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:49:18 crc kubenswrapper[4742]: I0317 11:49:18.046097 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:49:48 crc kubenswrapper[4742]: I0317 11:49:48.044418 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:49:48 crc kubenswrapper[4742]: I0317 11:49:48.045247 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:49:48 crc kubenswrapper[4742]: I0317 11:49:48.045322 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:49:48 crc kubenswrapper[4742]: I0317 11:49:48.046443 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca"} pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 11:49:48 crc kubenswrapper[4742]: I0317 11:49:48.046546 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" containerID="cri-o://f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" gracePeriod=600 Mar 17 11:49:48 crc kubenswrapper[4742]: E0317 11:49:48.195232 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:49:48 crc kubenswrapper[4742]: I0317 11:49:48.280205 4742 generic.go:334] "Generic (PLEG): container finished" podID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" exitCode=0 Mar 17 11:49:48 crc kubenswrapper[4742]: I0317 11:49:48.280274 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerDied","Data":"f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca"} Mar 17 11:49:48 crc kubenswrapper[4742]: I0317 11:49:48.280341 4742 scope.go:117] "RemoveContainer" containerID="f3e0af6893b2594265c0b520ca2bca430428f6f884f7c0a9258384a451ab4bae" Mar 17 11:49:48 crc kubenswrapper[4742]: I0317 11:49:48.281396 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:49:48 crc kubenswrapper[4742]: E0317 11:49:48.281954 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:50:00 crc kubenswrapper[4742]: I0317 11:50:00.165507 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562470-l9qjt"] Mar 17 11:50:00 crc kubenswrapper[4742]: E0317 11:50:00.166567 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b7e86e-1fbb-4f57-aee2-a5b1df40e17a" containerName="extract-content" Mar 17 11:50:00 crc kubenswrapper[4742]: I0317 11:50:00.166589 4742 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="42b7e86e-1fbb-4f57-aee2-a5b1df40e17a" containerName="extract-content" Mar 17 11:50:00 crc kubenswrapper[4742]: E0317 11:50:00.166629 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b7e86e-1fbb-4f57-aee2-a5b1df40e17a" containerName="registry-server" Mar 17 11:50:00 crc kubenswrapper[4742]: I0317 11:50:00.166641 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b7e86e-1fbb-4f57-aee2-a5b1df40e17a" containerName="registry-server" Mar 17 11:50:00 crc kubenswrapper[4742]: E0317 11:50:00.166675 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b7e86e-1fbb-4f57-aee2-a5b1df40e17a" containerName="extract-utilities" Mar 17 11:50:00 crc kubenswrapper[4742]: I0317 11:50:00.166688 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b7e86e-1fbb-4f57-aee2-a5b1df40e17a" containerName="extract-utilities" Mar 17 11:50:00 crc kubenswrapper[4742]: I0317 11:50:00.167058 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b7e86e-1fbb-4f57-aee2-a5b1df40e17a" containerName="registry-server" Mar 17 11:50:00 crc kubenswrapper[4742]: I0317 11:50:00.168168 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562470-l9qjt" Mar 17 11:50:00 crc kubenswrapper[4742]: I0317 11:50:00.171018 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:50:00 crc kubenswrapper[4742]: I0317 11:50:00.171125 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:50:00 crc kubenswrapper[4742]: I0317 11:50:00.171400 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:50:00 crc kubenswrapper[4742]: I0317 11:50:00.177512 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562470-l9qjt"] Mar 17 11:50:00 crc kubenswrapper[4742]: I0317 11:50:00.249004 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqnck\" (UniqueName: \"kubernetes.io/projected/f447ef44-8f08-4573-b884-fa3a098bc306-kube-api-access-hqnck\") pod \"auto-csr-approver-29562470-l9qjt\" (UID: \"f447ef44-8f08-4573-b884-fa3a098bc306\") " pod="openshift-infra/auto-csr-approver-29562470-l9qjt" Mar 17 11:50:00 crc kubenswrapper[4742]: I0317 11:50:00.351732 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqnck\" (UniqueName: \"kubernetes.io/projected/f447ef44-8f08-4573-b884-fa3a098bc306-kube-api-access-hqnck\") pod \"auto-csr-approver-29562470-l9qjt\" (UID: \"f447ef44-8f08-4573-b884-fa3a098bc306\") " pod="openshift-infra/auto-csr-approver-29562470-l9qjt" Mar 17 11:50:00 crc kubenswrapper[4742]: I0317 11:50:00.375513 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqnck\" (UniqueName: \"kubernetes.io/projected/f447ef44-8f08-4573-b884-fa3a098bc306-kube-api-access-hqnck\") pod \"auto-csr-approver-29562470-l9qjt\" (UID: \"f447ef44-8f08-4573-b884-fa3a098bc306\") " pod="openshift-infra/auto-csr-approver-29562470-l9qjt" Mar 17 11:50:00 crc kubenswrapper[4742]: I0317 11:50:00.498854 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562470-l9qjt" Mar 17 11:50:00 crc kubenswrapper[4742]: I0317 11:50:00.663331 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:50:00 crc kubenswrapper[4742]: E0317 11:50:00.663935 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:50:00 crc kubenswrapper[4742]: I0317 11:50:00.977344 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562470-l9qjt"] Mar 17 11:50:01 crc kubenswrapper[4742]: I0317 11:50:01.435602 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562470-l9qjt" event={"ID":"f447ef44-8f08-4573-b884-fa3a098bc306","Type":"ContainerStarted","Data":"4f334b6392b89ab6b72468601e04196e6107886d6a5f93eb4ff0e9161e4c168d"} Mar 17 11:50:02 crc kubenswrapper[4742]: I0317 11:50:02.445152 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562470-l9qjt" event={"ID":"f447ef44-8f08-4573-b884-fa3a098bc306","Type":"ContainerStarted","Data":"a6e05879ddb53b0bf63695538fe9524047bd7010c4daf32e55e9a8e14977ea5b"} Mar 17 11:50:02 crc kubenswrapper[4742]: I0317 11:50:02.464388 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562470-l9qjt" podStartSLOduration=1.574874755 podStartE2EDuration="2.464361422s" podCreationTimestamp="2026-03-17 11:50:00 +0000 UTC" firstStartedPulling="2026-03-17 11:50:00.994978425 +0000 UTC m=+2304.121106203" lastFinishedPulling="2026-03-17 11:50:01.884465072 +0000 UTC m=+2305.010592870" observedRunningTime="2026-03-17 11:50:02.458444048 +0000 UTC m=+2305.584571816" watchObservedRunningTime="2026-03-17 11:50:02.464361422 +0000 UTC m=+2305.590489190" Mar 17 11:50:03 crc kubenswrapper[4742]: I0317 11:50:03.462859 4742 generic.go:334] "Generic (PLEG): container finished" podID="f447ef44-8f08-4573-b884-fa3a098bc306" containerID="a6e05879ddb53b0bf63695538fe9524047bd7010c4daf32e55e9a8e14977ea5b" exitCode=0 Mar 17 11:50:03 crc kubenswrapper[4742]: I0317 11:50:03.463019 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562470-l9qjt" event={"ID":"f447ef44-8f08-4573-b884-fa3a098bc306","Type":"ContainerDied","Data":"a6e05879ddb53b0bf63695538fe9524047bd7010c4daf32e55e9a8e14977ea5b"} Mar 17 11:50:04 crc kubenswrapper[4742]: I0317 11:50:04.878006 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562470-l9qjt" Mar 17 11:50:04 crc kubenswrapper[4742]: I0317 11:50:04.986276 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqnck\" (UniqueName: \"kubernetes.io/projected/f447ef44-8f08-4573-b884-fa3a098bc306-kube-api-access-hqnck\") pod \"f447ef44-8f08-4573-b884-fa3a098bc306\" (UID: \"f447ef44-8f08-4573-b884-fa3a098bc306\") " Mar 17 11:50:04 crc kubenswrapper[4742]: I0317 11:50:04.999134 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f447ef44-8f08-4573-b884-fa3a098bc306-kube-api-access-hqnck" (OuterVolumeSpecName: "kube-api-access-hqnck") pod "f447ef44-8f08-4573-b884-fa3a098bc306" (UID: "f447ef44-8f08-4573-b884-fa3a098bc306"). InnerVolumeSpecName "kube-api-access-hqnck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:50:05 crc kubenswrapper[4742]: I0317 11:50:05.088734 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqnck\" (UniqueName: \"kubernetes.io/projected/f447ef44-8f08-4573-b884-fa3a098bc306-kube-api-access-hqnck\") on node \"crc\" DevicePath \"\"" Mar 17 11:50:05 crc kubenswrapper[4742]: I0317 11:50:05.487977 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562470-l9qjt" event={"ID":"f447ef44-8f08-4573-b884-fa3a098bc306","Type":"ContainerDied","Data":"4f334b6392b89ab6b72468601e04196e6107886d6a5f93eb4ff0e9161e4c168d"} Mar 17 11:50:05 crc kubenswrapper[4742]: I0317 11:50:05.488020 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f334b6392b89ab6b72468601e04196e6107886d6a5f93eb4ff0e9161e4c168d" Mar 17 11:50:05 crc kubenswrapper[4742]: I0317 11:50:05.488028 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562470-l9qjt" Mar 17 11:50:05 crc kubenswrapper[4742]: I0317 11:50:05.546995 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562464-mnq7n"] Mar 17 11:50:05 crc kubenswrapper[4742]: I0317 11:50:05.555805 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562464-mnq7n"] Mar 17 11:50:06 crc kubenswrapper[4742]: I0317 11:50:06.676057 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0236d66a-1c05-4000-928f-449316a872d2" path="/var/lib/kubelet/pods/0236d66a-1c05-4000-928f-449316a872d2/volumes" Mar 17 11:50:11 crc kubenswrapper[4742]: I0317 11:50:11.662716 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:50:11 crc kubenswrapper[4742]: E0317 11:50:11.663312 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:50:25 crc kubenswrapper[4742]: I0317 11:50:25.663659 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:50:25 crc kubenswrapper[4742]: E0317 11:50:25.664592 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:50:37 crc kubenswrapper[4742]: I0317 11:50:37.663642 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:50:37 crc kubenswrapper[4742]: E0317 11:50:37.665027 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:50:50 crc kubenswrapper[4742]: I0317 11:50:50.663189 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:50:50 crc kubenswrapper[4742]: E0317 11:50:50.664181 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:50:55 crc kubenswrapper[4742]: I0317 11:50:55.711533 4742 scope.go:117] "RemoveContainer" containerID="2ae2c8b474ef4c683309b76ef58c23b812c57753185be69f98636e91ab4d4390" Mar 17 
11:51:04 crc kubenswrapper[4742]: I0317 11:51:04.663748 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:51:04 crc kubenswrapper[4742]: E0317 11:51:04.664999 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:51:17 crc kubenswrapper[4742]: I0317 11:51:17.663327 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:51:17 crc kubenswrapper[4742]: E0317 11:51:17.664778 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:51:29 crc kubenswrapper[4742]: I0317 11:51:29.663306 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:51:29 crc kubenswrapper[4742]: E0317 11:51:29.664565 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:51:40 crc kubenswrapper[4742]: I0317 11:51:40.663702 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:51:40 crc kubenswrapper[4742]: E0317 11:51:40.666067 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:51:51 crc kubenswrapper[4742]: I0317 11:51:51.665447 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:51:51 crc kubenswrapper[4742]: E0317 11:51:51.666437 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:52:00 crc kubenswrapper[4742]: I0317 11:52:00.168296 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562472-mrxmh"] Mar 17 11:52:00 crc 
kubenswrapper[4742]: E0317 11:52:00.169228 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f447ef44-8f08-4573-b884-fa3a098bc306" containerName="oc" Mar 17 11:52:00 crc kubenswrapper[4742]: I0317 11:52:00.169252 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f447ef44-8f08-4573-b884-fa3a098bc306" containerName="oc" Mar 17 11:52:00 crc kubenswrapper[4742]: I0317 11:52:00.169575 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f447ef44-8f08-4573-b884-fa3a098bc306" containerName="oc" Mar 17 11:52:00 crc kubenswrapper[4742]: I0317 11:52:00.172278 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562472-mrxmh" Mar 17 11:52:00 crc kubenswrapper[4742]: I0317 11:52:00.174810 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:52:00 crc kubenswrapper[4742]: I0317 11:52:00.175860 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:52:00 crc kubenswrapper[4742]: I0317 11:52:00.179182 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:52:00 crc kubenswrapper[4742]: I0317 11:52:00.193951 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562472-mrxmh"] Mar 17 11:52:00 crc kubenswrapper[4742]: I0317 11:52:00.261045 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sfdf\" (UniqueName: \"kubernetes.io/projected/f49e6b82-e398-4429-bacc-57c2ec258328-kube-api-access-9sfdf\") pod \"auto-csr-approver-29562472-mrxmh\" (UID: \"f49e6b82-e398-4429-bacc-57c2ec258328\") " pod="openshift-infra/auto-csr-approver-29562472-mrxmh" Mar 17 11:52:00 crc kubenswrapper[4742]: I0317 11:52:00.363947 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sfdf\" (UniqueName: \"kubernetes.io/projected/f49e6b82-e398-4429-bacc-57c2ec258328-kube-api-access-9sfdf\") pod \"auto-csr-approver-29562472-mrxmh\" (UID: \"f49e6b82-e398-4429-bacc-57c2ec258328\") " pod="openshift-infra/auto-csr-approver-29562472-mrxmh" Mar 17 11:52:00 crc kubenswrapper[4742]: I0317 11:52:00.387620 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sfdf\" (UniqueName: \"kubernetes.io/projected/f49e6b82-e398-4429-bacc-57c2ec258328-kube-api-access-9sfdf\") pod \"auto-csr-approver-29562472-mrxmh\" (UID: \"f49e6b82-e398-4429-bacc-57c2ec258328\") " pod="openshift-infra/auto-csr-approver-29562472-mrxmh" Mar 17 11:52:00 crc kubenswrapper[4742]: I0317 11:52:00.504748 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562472-mrxmh" Mar 17 11:52:01 crc kubenswrapper[4742]: I0317 11:52:01.064006 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562472-mrxmh"] Mar 17 11:52:01 crc kubenswrapper[4742]: I0317 11:52:01.832885 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562472-mrxmh" event={"ID":"f49e6b82-e398-4429-bacc-57c2ec258328","Type":"ContainerStarted","Data":"10a4179c7247bddfafe08cb4aa9bcb318066f9e48555beed11528be31510a1c7"} Mar 17 11:52:02 crc kubenswrapper[4742]: I0317 11:52:02.845026 4742 generic.go:334] "Generic (PLEG): container finished" podID="f49e6b82-e398-4429-bacc-57c2ec258328" containerID="b53b3649b8d974792d06967a63d509ea06e3589d42c005a9b0b82985202dc535" exitCode=0 Mar 17 11:52:02 crc kubenswrapper[4742]: I0317 11:52:02.845156 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562472-mrxmh" event={"ID":"f49e6b82-e398-4429-bacc-57c2ec258328","Type":"ContainerDied","Data":"b53b3649b8d974792d06967a63d509ea06e3589d42c005a9b0b82985202dc535"} Mar 17 11:52:03 crc kubenswrapper[4742]: I0317 11:52:03.663678 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:52:03 crc kubenswrapper[4742]: E0317 11:52:03.663978 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:52:04 crc kubenswrapper[4742]: I0317 11:52:04.292607 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562472-mrxmh" Mar 17 11:52:04 crc kubenswrapper[4742]: I0317 11:52:04.342891 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sfdf\" (UniqueName: \"kubernetes.io/projected/f49e6b82-e398-4429-bacc-57c2ec258328-kube-api-access-9sfdf\") pod \"f49e6b82-e398-4429-bacc-57c2ec258328\" (UID: \"f49e6b82-e398-4429-bacc-57c2ec258328\") " Mar 17 11:52:04 crc kubenswrapper[4742]: I0317 11:52:04.351099 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f49e6b82-e398-4429-bacc-57c2ec258328-kube-api-access-9sfdf" (OuterVolumeSpecName: "kube-api-access-9sfdf") pod "f49e6b82-e398-4429-bacc-57c2ec258328" (UID: "f49e6b82-e398-4429-bacc-57c2ec258328"). InnerVolumeSpecName "kube-api-access-9sfdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:52:04 crc kubenswrapper[4742]: I0317 11:52:04.445112 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sfdf\" (UniqueName: \"kubernetes.io/projected/f49e6b82-e398-4429-bacc-57c2ec258328-kube-api-access-9sfdf\") on node \"crc\" DevicePath \"\"" Mar 17 11:52:04 crc kubenswrapper[4742]: I0317 11:52:04.872903 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562472-mrxmh" event={"ID":"f49e6b82-e398-4429-bacc-57c2ec258328","Type":"ContainerDied","Data":"10a4179c7247bddfafe08cb4aa9bcb318066f9e48555beed11528be31510a1c7"} Mar 17 11:52:04 crc kubenswrapper[4742]: I0317 11:52:04.873292 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10a4179c7247bddfafe08cb4aa9bcb318066f9e48555beed11528be31510a1c7" Mar 17 11:52:04 crc kubenswrapper[4742]: I0317 11:52:04.873013 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562472-mrxmh" Mar 17 11:52:05 crc kubenswrapper[4742]: I0317 11:52:05.386350 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562466-z9vmr"] Mar 17 11:52:05 crc kubenswrapper[4742]: I0317 11:52:05.410525 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562466-z9vmr"] Mar 17 11:52:06 crc kubenswrapper[4742]: I0317 11:52:06.678686 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d1620f5-1ec3-4841-85d3-c162d7e62454" path="/var/lib/kubelet/pods/4d1620f5-1ec3-4841-85d3-c162d7e62454/volumes" Mar 17 11:52:14 crc kubenswrapper[4742]: I0317 11:52:14.663257 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:52:14 crc kubenswrapper[4742]: E0317 11:52:14.664538 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:52:27 crc kubenswrapper[4742]: I0317 11:52:27.663831 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:52:27 crc kubenswrapper[4742]: E0317 11:52:27.665104 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:52:41 crc kubenswrapper[4742]: I0317 11:52:41.663382 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:52:41 crc kubenswrapper[4742]: E0317 11:52:41.664268 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:52:51 crc kubenswrapper[4742]: I0317 11:52:51.405459 4742 generic.go:334] "Generic (PLEG): container finished" podID="7fd024b3-844f-4118-92b5-81dcc6da9fd6" containerID="ecdf63b8b4171171266f48d9e154fda9b49c6ccb8997821fe4870ca7644a2069" exitCode=0 Mar 17 11:52:51 crc kubenswrapper[4742]: I0317 11:52:51.405546 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" event={"ID":"7fd024b3-844f-4118-92b5-81dcc6da9fd6","Type":"ContainerDied","Data":"ecdf63b8b4171171266f48d9e154fda9b49c6ccb8997821fe4870ca7644a2069"} Mar 17 11:52:52 crc kubenswrapper[4742]: I0317 11:52:52.889784 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:52:52 crc kubenswrapper[4742]: I0317 11:52:52.917630 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-libvirt-secret-0\") pod \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " Mar 17 11:52:52 crc kubenswrapper[4742]: I0317 11:52:52.917779 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6npbv\" (UniqueName: \"kubernetes.io/projected/7fd024b3-844f-4118-92b5-81dcc6da9fd6-kube-api-access-6npbv\") pod \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " Mar 17 11:52:52 crc kubenswrapper[4742]: I0317 11:52:52.925404 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fd024b3-844f-4118-92b5-81dcc6da9fd6-kube-api-access-6npbv" (OuterVolumeSpecName: "kube-api-access-6npbv") pod "7fd024b3-844f-4118-92b5-81dcc6da9fd6" (UID: "7fd024b3-844f-4118-92b5-81dcc6da9fd6"). InnerVolumeSpecName "kube-api-access-6npbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:52:52 crc kubenswrapper[4742]: I0317 11:52:52.952135 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "7fd024b3-844f-4118-92b5-81dcc6da9fd6" (UID: "7fd024b3-844f-4118-92b5-81dcc6da9fd6"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.019354 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-inventory\") pod \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.019401 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-libvirt-combined-ca-bundle\") pod \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.019429 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-ssh-key-openstack-edpm-ipam\") pod \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\" (UID: \"7fd024b3-844f-4118-92b5-81dcc6da9fd6\") " Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.020038 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6npbv\" (UniqueName: \"kubernetes.io/projected/7fd024b3-844f-4118-92b5-81dcc6da9fd6-kube-api-access-6npbv\") on node \"crc\" DevicePath \"\"" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.020061 4742 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.022532 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7fd024b3-844f-4118-92b5-81dcc6da9fd6" (UID: "7fd024b3-844f-4118-92b5-81dcc6da9fd6"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.044137 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-inventory" (OuterVolumeSpecName: "inventory") pod "7fd024b3-844f-4118-92b5-81dcc6da9fd6" (UID: "7fd024b3-844f-4118-92b5-81dcc6da9fd6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.050070 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7fd024b3-844f-4118-92b5-81dcc6da9fd6" (UID: "7fd024b3-844f-4118-92b5-81dcc6da9fd6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.123933 4742 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.123972 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.123990 4742 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7fd024b3-844f-4118-92b5-81dcc6da9fd6-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.430691 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" event={"ID":"7fd024b3-844f-4118-92b5-81dcc6da9fd6","Type":"ContainerDied","Data":"b68f745e97db4c5cafd995db63ede69dd7468010d264aac205c52b277fbbaeaa"} Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.430775 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b68f745e97db4c5cafd995db63ede69dd7468010d264aac205c52b277fbbaeaa" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.430788 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.557026 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7"] Mar 17 11:52:53 crc kubenswrapper[4742]: E0317 11:52:53.557600 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49e6b82-e398-4429-bacc-57c2ec258328" containerName="oc" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.557624 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49e6b82-e398-4429-bacc-57c2ec258328" containerName="oc" Mar 17 11:52:53 crc kubenswrapper[4742]: E0317 11:52:53.557644 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fd024b3-844f-4118-92b5-81dcc6da9fd6" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.557653 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd024b3-844f-4118-92b5-81dcc6da9fd6" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.557872 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49e6b82-e398-4429-bacc-57c2ec258328" containerName="oc" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.557901 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fd024b3-844f-4118-92b5-81dcc6da9fd6" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.558701 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.561467 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.561725 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.561875 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.562881 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.563089 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8b7p" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.563204 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.567248 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7"] Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.567470 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.734336 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.734503 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.735367 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.735445 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.735480 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.735517 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mnk6\" (UniqueName: \"kubernetes.io/projected/6468192a-58e3-4b66-9551-1d67dc93f0ae-kube-api-access-7mnk6\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.735563 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.735646 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.735729 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.736202 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.736349 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.837988 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mnk6\" (UniqueName: \"kubernetes.io/projected/6468192a-58e3-4b66-9551-1d67dc93f0ae-kube-api-access-7mnk6\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.838131 4742 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.838186 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.838260 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.838319 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.838388 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.838479 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.838539 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.838586 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.838649 4742 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.838691 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.840649 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.845374 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.845848 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.845844 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.848943 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.850900 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.851781 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.852965 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.855663 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.857380 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.862280 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mnk6\" (UniqueName: \"kubernetes.io/projected/6468192a-58e3-4b66-9551-1d67dc93f0ae-kube-api-access-7mnk6\") pod \"nova-edpm-deployment-openstack-edpm-ipam-76jn7\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:53 crc kubenswrapper[4742]: I0317 11:52:53.931450 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:52:55 crc kubenswrapper[4742]: I0317 11:52:55.059601 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7"] Mar 17 11:52:55 crc kubenswrapper[4742]: I0317 11:52:55.458479 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" event={"ID":"6468192a-58e3-4b66-9551-1d67dc93f0ae","Type":"ContainerStarted","Data":"154b6ce5d54fb1cec325c0b1b1839372710f1d52250d772935184c6d9ecf5e5c"} Mar 17 11:52:55 crc kubenswrapper[4742]: I0317 11:52:55.663230 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:52:55 crc kubenswrapper[4742]: E0317 11:52:55.663766 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:52:55 crc kubenswrapper[4742]: I0317 11:52:55.855746 4742 scope.go:117] "RemoveContainer" containerID="653150bc1e23148b7ab0b6c2417a4318fd0b5b4a4929d6f22f6786b0aeb66151" Mar 17 11:52:56 crc kubenswrapper[4742]: I0317 11:52:56.473198 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" event={"ID":"6468192a-58e3-4b66-9551-1d67dc93f0ae","Type":"ContainerStarted","Data":"efc6ca66e9e7b010b4631451b7caecc164b8c97910f41a336c99ea44e1f0ea83"} Mar 17 11:53:10 crc kubenswrapper[4742]: I0317 11:53:10.664107 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:53:10 crc kubenswrapper[4742]: E0317 11:53:10.664762 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:53:24 crc kubenswrapper[4742]: I0317 11:53:24.662823 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:53:24 crc kubenswrapper[4742]: E0317 11:53:24.663487 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:53:36 crc kubenswrapper[4742]: I0317 11:53:36.663694 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:53:36 crc kubenswrapper[4742]: E0317 11:53:36.664542 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:53:51 crc kubenswrapper[4742]: I0317 11:53:51.663564 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:53:51 crc kubenswrapper[4742]: E0317 11:53:51.664787 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:54:00 crc kubenswrapper[4742]: I0317 11:54:00.148221 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" podStartSLOduration=66.256633547 podStartE2EDuration="1m7.148197421s" podCreationTimestamp="2026-03-17 11:52:53 +0000 UTC" firstStartedPulling="2026-03-17 11:52:55.077749684 +0000 UTC m=+2478.203877482" lastFinishedPulling="2026-03-17 11:52:55.969313588 +0000 UTC m=+2479.095441356" observedRunningTime="2026-03-17 11:52:56.515216243 +0000 UTC m=+2479.641344021" watchObservedRunningTime="2026-03-17 11:54:00.148197421 +0000 UTC m=+2543.274325199" Mar 17 11:54:00 crc kubenswrapper[4742]: I0317 11:54:00.155249 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562474-k9hnw"] Mar 17 11:54:00 crc kubenswrapper[4742]: I0317 11:54:00.169740 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562474-k9hnw"] Mar 17 11:54:00 crc kubenswrapper[4742]: I0317 11:54:00.169880 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562474-k9hnw" Mar 17 11:54:00 crc kubenswrapper[4742]: I0317 11:54:00.172894 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:54:00 crc kubenswrapper[4742]: I0317 11:54:00.172894 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:54:00 crc kubenswrapper[4742]: I0317 11:54:00.173292 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:54:00 crc kubenswrapper[4742]: I0317 11:54:00.278496 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn26f\" (UniqueName: \"kubernetes.io/projected/9601ffdf-5d67-45e7-88da-12672c58e00e-kube-api-access-sn26f\") pod \"auto-csr-approver-29562474-k9hnw\" (UID: \"9601ffdf-5d67-45e7-88da-12672c58e00e\") " pod="openshift-infra/auto-csr-approver-29562474-k9hnw" Mar 17 11:54:00 crc kubenswrapper[4742]: I0317 11:54:00.380552 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn26f\" (UniqueName: \"kubernetes.io/projected/9601ffdf-5d67-45e7-88da-12672c58e00e-kube-api-access-sn26f\") pod \"auto-csr-approver-29562474-k9hnw\" (UID: \"9601ffdf-5d67-45e7-88da-12672c58e00e\") " pod="openshift-infra/auto-csr-approver-29562474-k9hnw" Mar 17 11:54:00 crc kubenswrapper[4742]: I0317 11:54:00.416210 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn26f\" (UniqueName: \"kubernetes.io/projected/9601ffdf-5d67-45e7-88da-12672c58e00e-kube-api-access-sn26f\") pod \"auto-csr-approver-29562474-k9hnw\" (UID: \"9601ffdf-5d67-45e7-88da-12672c58e00e\") " pod="openshift-infra/auto-csr-approver-29562474-k9hnw" Mar 17 11:54:00 crc kubenswrapper[4742]: I0317 11:54:00.491576 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562474-k9hnw" Mar 17 11:54:00 crc kubenswrapper[4742]: I0317 11:54:00.856003 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562474-k9hnw"] Mar 17 11:54:00 crc kubenswrapper[4742]: I0317 11:54:00.859310 4742 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 11:54:01 crc kubenswrapper[4742]: I0317 11:54:01.214729 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562474-k9hnw" event={"ID":"9601ffdf-5d67-45e7-88da-12672c58e00e","Type":"ContainerStarted","Data":"024d681d3ac35abc07c8678ee7f4f93231ac0843e9df7c3211a0849fd5f7699f"} Mar 17 11:54:02 crc kubenswrapper[4742]: I0317 11:54:02.230925 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562474-k9hnw" event={"ID":"9601ffdf-5d67-45e7-88da-12672c58e00e","Type":"ContainerStarted","Data":"8f7a71723f081cfb9b32621aa20813e40d3a14bb63ad901b7e6005a37bb634b1"} Mar 17 11:54:02 crc kubenswrapper[4742]: I0317 11:54:02.256402 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562474-k9hnw" podStartSLOduration=1.2363892619999999 podStartE2EDuration="2.256376319s" podCreationTimestamp="2026-03-17 11:54:00 +0000 UTC" firstStartedPulling="2026-03-17 11:54:00.858834352 +0000 UTC m=+2543.984962140" lastFinishedPulling="2026-03-17 11:54:01.878821429 +0000 UTC m=+2545.004949197" observedRunningTime="2026-03-17 11:54:02.251157824 +0000 UTC m=+2545.377285592" watchObservedRunningTime="2026-03-17 11:54:02.256376319 +0000 UTC m=+2545.382504107" Mar 17 11:54:03 crc kubenswrapper[4742]: I0317 11:54:03.240816 4742 generic.go:334] "Generic (PLEG): container finished" podID="9601ffdf-5d67-45e7-88da-12672c58e00e" containerID="8f7a71723f081cfb9b32621aa20813e40d3a14bb63ad901b7e6005a37bb634b1" exitCode=0 Mar 17 11:54:03 crc kubenswrapper[4742]: I0317 11:54:03.240859 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562474-k9hnw" event={"ID":"9601ffdf-5d67-45e7-88da-12672c58e00e","Type":"ContainerDied","Data":"8f7a71723f081cfb9b32621aa20813e40d3a14bb63ad901b7e6005a37bb634b1"} Mar 17 11:54:04 crc kubenswrapper[4742]: I0317 11:54:04.653641 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562474-k9hnw" Mar 17 11:54:04 crc kubenswrapper[4742]: I0317 11:54:04.664675 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:54:04 crc kubenswrapper[4742]: E0317 11:54:04.665521 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:54:04 crc kubenswrapper[4742]: I0317 11:54:04.774968 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn26f\" (UniqueName: \"kubernetes.io/projected/9601ffdf-5d67-45e7-88da-12672c58e00e-kube-api-access-sn26f\") pod \"9601ffdf-5d67-45e7-88da-12672c58e00e\" (UID: \"9601ffdf-5d67-45e7-88da-12672c58e00e\") " Mar 17 11:54:04 crc kubenswrapper[4742]: I0317 11:54:04.781478 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9601ffdf-5d67-45e7-88da-12672c58e00e-kube-api-access-sn26f" (OuterVolumeSpecName: "kube-api-access-sn26f") pod "9601ffdf-5d67-45e7-88da-12672c58e00e" (UID: "9601ffdf-5d67-45e7-88da-12672c58e00e"). InnerVolumeSpecName "kube-api-access-sn26f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:54:04 crc kubenswrapper[4742]: I0317 11:54:04.877697 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn26f\" (UniqueName: \"kubernetes.io/projected/9601ffdf-5d67-45e7-88da-12672c58e00e-kube-api-access-sn26f\") on node \"crc\" DevicePath \"\"" Mar 17 11:54:05 crc kubenswrapper[4742]: I0317 11:54:05.266830 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562474-k9hnw" event={"ID":"9601ffdf-5d67-45e7-88da-12672c58e00e","Type":"ContainerDied","Data":"024d681d3ac35abc07c8678ee7f4f93231ac0843e9df7c3211a0849fd5f7699f"} Mar 17 11:54:05 crc kubenswrapper[4742]: I0317 11:54:05.266974 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="024d681d3ac35abc07c8678ee7f4f93231ac0843e9df7c3211a0849fd5f7699f" Mar 17 11:54:05 crc kubenswrapper[4742]: I0317 11:54:05.266998 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562474-k9hnw" Mar 17 11:54:05 crc kubenswrapper[4742]: I0317 11:54:05.341647 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562468-dwhfv"] Mar 17 11:54:05 crc kubenswrapper[4742]: I0317 11:54:05.350704 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562468-dwhfv"] Mar 17 11:54:06 crc kubenswrapper[4742]: I0317 11:54:06.682659 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0" path="/var/lib/kubelet/pods/69a5cf8a-b6bd-4cd5-b7b6-fbc90abaaea0/volumes" Mar 17 11:54:10 crc kubenswrapper[4742]: I0317 11:54:10.573051 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g8xmv"] Mar 17 11:54:10 crc kubenswrapper[4742]: E0317 11:54:10.574359 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9601ffdf-5d67-45e7-88da-12672c58e00e" containerName="oc" Mar 17 11:54:10 crc kubenswrapper[4742]: I0317 11:54:10.574384 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="9601ffdf-5d67-45e7-88da-12672c58e00e" containerName="oc" Mar 17 11:54:10 crc kubenswrapper[4742]: I0317 11:54:10.574787 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="9601ffdf-5d67-45e7-88da-12672c58e00e" containerName="oc" Mar 17 11:54:10 crc kubenswrapper[4742]: I0317 11:54:10.577609 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g8xmv" Mar 17 11:54:10 crc kubenswrapper[4742]: I0317 11:54:10.591659 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g8xmv"] Mar 17 11:54:10 crc kubenswrapper[4742]: I0317 11:54:10.701183 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31680d2d-dcd5-4ecf-a19b-246b69d584e5-catalog-content\") pod \"community-operators-g8xmv\" (UID: \"31680d2d-dcd5-4ecf-a19b-246b69d584e5\") " pod="openshift-marketplace/community-operators-g8xmv" Mar 17 11:54:10 crc kubenswrapper[4742]: I0317 11:54:10.701231 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f626q\" (UniqueName: \"kubernetes.io/projected/31680d2d-dcd5-4ecf-a19b-246b69d584e5-kube-api-access-f626q\") pod \"community-operators-g8xmv\" (UID: \"31680d2d-dcd5-4ecf-a19b-246b69d584e5\") " pod="openshift-marketplace/community-operators-g8xmv" Mar 17 11:54:10 crc kubenswrapper[4742]: I0317 11:54:10.701280 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31680d2d-dcd5-4ecf-a19b-246b69d584e5-utilities\") pod \"community-operators-g8xmv\" (UID: \"31680d2d-dcd5-4ecf-a19b-246b69d584e5\") " pod="openshift-marketplace/community-operators-g8xmv" Mar 17 11:54:10 crc kubenswrapper[4742]: I0317 11:54:10.803389 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f626q\" (UniqueName: \"kubernetes.io/projected/31680d2d-dcd5-4ecf-a19b-246b69d584e5-kube-api-access-f626q\") pod \"community-operators-g8xmv\" (UID: \"31680d2d-dcd5-4ecf-a19b-246b69d584e5\") " pod="openshift-marketplace/community-operators-g8xmv" Mar 17 11:54:10 crc kubenswrapper[4742]: I0317 11:54:10.803561 4742 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31680d2d-dcd5-4ecf-a19b-246b69d584e5-utilities\") pod \"community-operators-g8xmv\" (UID: \"31680d2d-dcd5-4ecf-a19b-246b69d584e5\") " pod="openshift-marketplace/community-operators-g8xmv" Mar 17 11:54:10 crc kubenswrapper[4742]: I0317 11:54:10.803987 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31680d2d-dcd5-4ecf-a19b-246b69d584e5-catalog-content\") pod \"community-operators-g8xmv\" (UID: \"31680d2d-dcd5-4ecf-a19b-246b69d584e5\") " pod="openshift-marketplace/community-operators-g8xmv" Mar 17 11:54:10 crc kubenswrapper[4742]: I0317 11:54:10.804319 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31680d2d-dcd5-4ecf-a19b-246b69d584e5-utilities\") pod \"community-operators-g8xmv\" (UID: \"31680d2d-dcd5-4ecf-a19b-246b69d584e5\") " pod="openshift-marketplace/community-operators-g8xmv" Mar 17 11:54:10 crc kubenswrapper[4742]: I0317 11:54:10.805206 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31680d2d-dcd5-4ecf-a19b-246b69d584e5-catalog-content\") pod \"community-operators-g8xmv\" (UID: \"31680d2d-dcd5-4ecf-a19b-246b69d584e5\") " pod="openshift-marketplace/community-operators-g8xmv" Mar 17 11:54:10 crc kubenswrapper[4742]: I0317 11:54:10.844270 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f626q\" (UniqueName: \"kubernetes.io/projected/31680d2d-dcd5-4ecf-a19b-246b69d584e5-kube-api-access-f626q\") pod \"community-operators-g8xmv\" (UID: \"31680d2d-dcd5-4ecf-a19b-246b69d584e5\") " pod="openshift-marketplace/community-operators-g8xmv" Mar 17 11:54:10 crc kubenswrapper[4742]: I0317 11:54:10.925923 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g8xmv" Mar 17 11:54:11 crc kubenswrapper[4742]: I0317 11:54:11.515445 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g8xmv"] Mar 17 11:54:11 crc kubenswrapper[4742]: W0317 11:54:11.519626 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31680d2d_dcd5_4ecf_a19b_246b69d584e5.slice/crio-6013d72deef3e72ec87d68947ac1cb99d5e7ab7d55eab95af390283fbcc1eeb4 WatchSource:0}: Error finding container 6013d72deef3e72ec87d68947ac1cb99d5e7ab7d55eab95af390283fbcc1eeb4: Status 404 returned error can't find the container with id 6013d72deef3e72ec87d68947ac1cb99d5e7ab7d55eab95af390283fbcc1eeb4 Mar 17 11:54:12 crc kubenswrapper[4742]: I0317 11:54:12.348074 4742 generic.go:334] "Generic (PLEG): container finished" podID="31680d2d-dcd5-4ecf-a19b-246b69d584e5" containerID="fca018b4ea5dc4cad992d153caef56311427aed8f280b91e075763eb3edfaa7b" exitCode=0 Mar 17 11:54:12 crc kubenswrapper[4742]: I0317 11:54:12.348158 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8xmv" event={"ID":"31680d2d-dcd5-4ecf-a19b-246b69d584e5","Type":"ContainerDied","Data":"fca018b4ea5dc4cad992d153caef56311427aed8f280b91e075763eb3edfaa7b"} Mar 17 11:54:12 crc kubenswrapper[4742]: I0317 11:54:12.348699 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8xmv" event={"ID":"31680d2d-dcd5-4ecf-a19b-246b69d584e5","Type":"ContainerStarted","Data":"6013d72deef3e72ec87d68947ac1cb99d5e7ab7d55eab95af390283fbcc1eeb4"} Mar 17 11:54:12 crc kubenswrapper[4742]: I0317 11:54:12.958835 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xfqld"] Mar 17 11:54:12 crc kubenswrapper[4742]: I0317 11:54:12.962944 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xfqld" Mar 17 11:54:12 crc kubenswrapper[4742]: I0317 11:54:12.978979 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xfqld"] Mar 17 11:54:13 crc kubenswrapper[4742]: I0317 11:54:13.053257 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrvlt\" (UniqueName: \"kubernetes.io/projected/0b2be836-5107-4b16-9210-594dfd932c40-kube-api-access-qrvlt\") pod \"redhat-marketplace-xfqld\" (UID: \"0b2be836-5107-4b16-9210-594dfd932c40\") " pod="openshift-marketplace/redhat-marketplace-xfqld" Mar 17 11:54:13 crc kubenswrapper[4742]: I0317 11:54:13.053338 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2be836-5107-4b16-9210-594dfd932c40-catalog-content\") pod \"redhat-marketplace-xfqld\" (UID: \"0b2be836-5107-4b16-9210-594dfd932c40\") " pod="openshift-marketplace/redhat-marketplace-xfqld" Mar 17 11:54:13 crc kubenswrapper[4742]: I0317 11:54:13.053435 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2be836-5107-4b16-9210-594dfd932c40-utilities\") pod \"redhat-marketplace-xfqld\" (UID: \"0b2be836-5107-4b16-9210-594dfd932c40\") " pod="openshift-marketplace/redhat-marketplace-xfqld" Mar 17 11:54:13 crc kubenswrapper[4742]: I0317 11:54:13.155346 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrvlt\" (UniqueName: \"kubernetes.io/projected/0b2be836-5107-4b16-9210-594dfd932c40-kube-api-access-qrvlt\") pod \"redhat-marketplace-xfqld\" (UID: \"0b2be836-5107-4b16-9210-594dfd932c40\") " pod="openshift-marketplace/redhat-marketplace-xfqld" Mar 17 11:54:13 crc kubenswrapper[4742]: I0317 11:54:13.155486 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2be836-5107-4b16-9210-594dfd932c40-catalog-content\") pod \"redhat-marketplace-xfqld\" (UID: \"0b2be836-5107-4b16-9210-594dfd932c40\") " pod="openshift-marketplace/redhat-marketplace-xfqld" Mar 17 11:54:13 crc kubenswrapper[4742]: I0317 11:54:13.155650 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2be836-5107-4b16-9210-594dfd932c40-utilities\") pod \"redhat-marketplace-xfqld\" (UID: \"0b2be836-5107-4b16-9210-594dfd932c40\") " pod="openshift-marketplace/redhat-marketplace-xfqld" Mar 17 11:54:13 crc kubenswrapper[4742]: I0317 11:54:13.156084 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2be836-5107-4b16-9210-594dfd932c40-catalog-content\") pod \"redhat-marketplace-xfqld\" (UID: \"0b2be836-5107-4b16-9210-594dfd932c40\") " pod="openshift-marketplace/redhat-marketplace-xfqld" Mar 17 11:54:13 crc kubenswrapper[4742]: I0317 11:54:13.156196 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2be836-5107-4b16-9210-594dfd932c40-utilities\") pod \"redhat-marketplace-xfqld\" (UID: \"0b2be836-5107-4b16-9210-594dfd932c40\") " pod="openshift-marketplace/redhat-marketplace-xfqld" Mar 17 11:54:13 crc kubenswrapper[4742]: I0317 11:54:13.182833 4742 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qrvlt\" (UniqueName: \"kubernetes.io/projected/0b2be836-5107-4b16-9210-594dfd932c40-kube-api-access-qrvlt\") pod \"redhat-marketplace-xfqld\" (UID: \"0b2be836-5107-4b16-9210-594dfd932c40\") " pod="openshift-marketplace/redhat-marketplace-xfqld" Mar 17 11:54:13 crc kubenswrapper[4742]: I0317 11:54:13.303798 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xfqld" Mar 17 11:54:13 crc kubenswrapper[4742]: I0317 11:54:13.766648 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xfqld"] Mar 17 11:54:14 crc kubenswrapper[4742]: I0317 11:54:14.370399 4742 generic.go:334] "Generic (PLEG): container finished" podID="0b2be836-5107-4b16-9210-594dfd932c40" containerID="77d2920c565f1fad45d6515458e01311c687958bd2f408f10174f8944e6f8e94" exitCode=0 Mar 17 11:54:14 crc kubenswrapper[4742]: I0317 11:54:14.370552 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xfqld" event={"ID":"0b2be836-5107-4b16-9210-594dfd932c40","Type":"ContainerDied","Data":"77d2920c565f1fad45d6515458e01311c687958bd2f408f10174f8944e6f8e94"} Mar 17 11:54:14 crc kubenswrapper[4742]: I0317 11:54:14.370597 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xfqld" event={"ID":"0b2be836-5107-4b16-9210-594dfd932c40","Type":"ContainerStarted","Data":"95e73ad8daaf0618679042d0955e1c7de609f28d18d2a844f91124fe86b39330"} Mar 17 11:54:14 crc kubenswrapper[4742]: I0317 11:54:14.373599 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8xmv" event={"ID":"31680d2d-dcd5-4ecf-a19b-246b69d584e5","Type":"ContainerStarted","Data":"b593dfc5a9338fcede8152194cec4dfa76cb6f55430e5988022d2c2b04b743e2"} Mar 17 11:54:15 crc kubenswrapper[4742]: I0317 11:54:15.391038 4742 generic.go:334] "Generic (PLEG): container finished" podID="31680d2d-dcd5-4ecf-a19b-246b69d584e5" containerID="b593dfc5a9338fcede8152194cec4dfa76cb6f55430e5988022d2c2b04b743e2" exitCode=0 Mar 17 11:54:15 crc kubenswrapper[4742]: I0317 11:54:15.391144 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8xmv" event={"ID":"31680d2d-dcd5-4ecf-a19b-246b69d584e5","Type":"ContainerDied","Data":"b593dfc5a9338fcede8152194cec4dfa76cb6f55430e5988022d2c2b04b743e2"} Mar 17 11:54:16 crc kubenswrapper[4742]: I0317 11:54:16.405435 4742 generic.go:334] "Generic (PLEG): container finished" podID="0b2be836-5107-4b16-9210-594dfd932c40" containerID="f32e8563f67d5389c5735832753efdea170e59145f3fee7dfd3b854ee9502c16" exitCode=0 Mar 17 11:54:16 crc kubenswrapper[4742]: I0317 11:54:16.405530 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xfqld" event={"ID":"0b2be836-5107-4b16-9210-594dfd932c40","Type":"ContainerDied","Data":"f32e8563f67d5389c5735832753efdea170e59145f3fee7dfd3b854ee9502c16"} Mar 17 11:54:16 crc kubenswrapper[4742]: I0317 11:54:16.413177 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8xmv" event={"ID":"31680d2d-dcd5-4ecf-a19b-246b69d584e5","Type":"ContainerStarted","Data":"433fa45a6b4d763fbc6481ea93cc03c7fe484c45020bf19a21d315bbf6ea9c40"} Mar 17 11:54:16 crc kubenswrapper[4742]: I0317 11:54:16.472083 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g8xmv" 
podStartSLOduration=3.021132676 podStartE2EDuration="6.471881844s" podCreationTimestamp="2026-03-17 11:54:10 +0000 UTC" firstStartedPulling="2026-03-17 11:54:12.350451347 +0000 UTC m=+2555.476579105" lastFinishedPulling="2026-03-17 11:54:15.801200505 +0000 UTC m=+2558.927328273" observedRunningTime="2026-03-17 11:54:16.464665855 +0000 UTC m=+2559.590793623" watchObservedRunningTime="2026-03-17 11:54:16.471881844 +0000 UTC m=+2559.598009612" Mar 17 11:54:17 crc kubenswrapper[4742]: I0317 11:54:17.423504 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xfqld" event={"ID":"0b2be836-5107-4b16-9210-594dfd932c40","Type":"ContainerStarted","Data":"08fdf4fe9b55465cbb76f0ff413afeba02812bb99fb8c722df355c43cdd85867"} Mar 17 11:54:17 crc kubenswrapper[4742]: I0317 11:54:17.462240 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xfqld" podStartSLOduration=2.9946034470000003 podStartE2EDuration="5.462214233s" podCreationTimestamp="2026-03-17 11:54:12 +0000 UTC" firstStartedPulling="2026-03-17 11:54:14.372763636 +0000 UTC m=+2557.498891424" lastFinishedPulling="2026-03-17 11:54:16.840374442 +0000 UTC m=+2559.966502210" observedRunningTime="2026-03-17 11:54:17.451741274 +0000 UTC m=+2560.577869032" watchObservedRunningTime="2026-03-17 11:54:17.462214233 +0000 UTC m=+2560.588342031" Mar 17 11:54:19 crc kubenswrapper[4742]: I0317 11:54:19.663173 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:54:19 crc kubenswrapper[4742]: E0317 11:54:19.664666 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:54:20 crc kubenswrapper[4742]: I0317 11:54:20.926493 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g8xmv" Mar 17 11:54:20 crc kubenswrapper[4742]: I0317 11:54:20.926827 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g8xmv" Mar 17 11:54:21 crc kubenswrapper[4742]: I0317 11:54:21.024154 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g8xmv" Mar 17 11:54:21 crc kubenswrapper[4742]: I0317 11:54:21.551540 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g8xmv" Mar 17 11:54:21 crc kubenswrapper[4742]: I0317 11:54:21.944816 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g8xmv"] Mar 17 11:54:23 crc kubenswrapper[4742]: I0317 11:54:23.304172 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xfqld" Mar 17 11:54:23 crc kubenswrapper[4742]: I0317 11:54:23.304705 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xfqld" Mar 17 11:54:23 crc kubenswrapper[4742]: I0317 11:54:23.357265 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
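The startup-latency line for community-operators-g8xmv above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (11:54:16.471881844 − 11:54:10 = 6.471881844s), and podStartSLOduration subtracts the image-pull window measured on the monotonic clock (m=+2558.927328273 − m=+2555.476579105 = 3.450749168s), leaving 3.021132676s. A quick check of that arithmetic in Go:

```go
package main

import "fmt"

func main() {
	// Monotonic-clock readings (the m=+... values) from the latency line above.
	firstStartedPulling := 2555.476579105
	lastFinishedPulling := 2558.927328273
	e2e := 6.471881844 // watchObservedRunningTime - podCreationTimestamp, in seconds

	// The SLO duration excludes image-pull time from end-to-end startup time.
	slo := e2e - (lastFinishedPulling - firstStartedPulling)
	fmt.Printf("podStartSLOduration=%.9f\n", slo) // 3.021132676, matching the log
}
```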
pod="openshift-marketplace/redhat-marketplace-xfqld" Mar 17 11:54:23 crc kubenswrapper[4742]: I0317 11:54:23.491473 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g8xmv" podUID="31680d2d-dcd5-4ecf-a19b-246b69d584e5" containerName="registry-server" containerID="cri-o://433fa45a6b4d763fbc6481ea93cc03c7fe484c45020bf19a21d315bbf6ea9c40" gracePeriod=2 Mar 17 11:54:23 crc kubenswrapper[4742]: I0317 11:54:23.547633 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xfqld" Mar 17 11:54:23 crc kubenswrapper[4742]: I0317 11:54:23.941961 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g8xmv" Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.097652 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31680d2d-dcd5-4ecf-a19b-246b69d584e5-utilities\") pod \"31680d2d-dcd5-4ecf-a19b-246b69d584e5\" (UID: \"31680d2d-dcd5-4ecf-a19b-246b69d584e5\") " Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.097941 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f626q\" (UniqueName: \"kubernetes.io/projected/31680d2d-dcd5-4ecf-a19b-246b69d584e5-kube-api-access-f626q\") pod \"31680d2d-dcd5-4ecf-a19b-246b69d584e5\" (UID: \"31680d2d-dcd5-4ecf-a19b-246b69d584e5\") " Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.097977 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31680d2d-dcd5-4ecf-a19b-246b69d584e5-catalog-content\") pod \"31680d2d-dcd5-4ecf-a19b-246b69d584e5\" (UID: \"31680d2d-dcd5-4ecf-a19b-246b69d584e5\") " Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.098439 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31680d2d-dcd5-4ecf-a19b-246b69d584e5-utilities" (OuterVolumeSpecName: "utilities") pod "31680d2d-dcd5-4ecf-a19b-246b69d584e5" (UID: "31680d2d-dcd5-4ecf-a19b-246b69d584e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.105303 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31680d2d-dcd5-4ecf-a19b-246b69d584e5-kube-api-access-f626q" (OuterVolumeSpecName: "kube-api-access-f626q") pod "31680d2d-dcd5-4ecf-a19b-246b69d584e5" (UID: "31680d2d-dcd5-4ecf-a19b-246b69d584e5"). InnerVolumeSpecName "kube-api-access-f626q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.199986 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f626q\" (UniqueName: \"kubernetes.io/projected/31680d2d-dcd5-4ecf-a19b-246b69d584e5-kube-api-access-f626q\") on node \"crc\" DevicePath \"\"" Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.200021 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31680d2d-dcd5-4ecf-a19b-246b69d584e5-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.226390 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31680d2d-dcd5-4ecf-a19b-246b69d584e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31680d2d-dcd5-4ecf-a19b-246b69d584e5" (UID: "31680d2d-dcd5-4ecf-a19b-246b69d584e5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.302298 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31680d2d-dcd5-4ecf-a19b-246b69d584e5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.502657 4742 generic.go:334] "Generic (PLEG): container finished" podID="31680d2d-dcd5-4ecf-a19b-246b69d584e5" containerID="433fa45a6b4d763fbc6481ea93cc03c7fe484c45020bf19a21d315bbf6ea9c40" exitCode=0 Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.502711 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g8xmv" Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.502757 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8xmv" event={"ID":"31680d2d-dcd5-4ecf-a19b-246b69d584e5","Type":"ContainerDied","Data":"433fa45a6b4d763fbc6481ea93cc03c7fe484c45020bf19a21d315bbf6ea9c40"} Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.502784 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8xmv" event={"ID":"31680d2d-dcd5-4ecf-a19b-246b69d584e5","Type":"ContainerDied","Data":"6013d72deef3e72ec87d68947ac1cb99d5e7ab7d55eab95af390283fbcc1eeb4"} Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.502804 4742 scope.go:117] "RemoveContainer" containerID="433fa45a6b4d763fbc6481ea93cc03c7fe484c45020bf19a21d315bbf6ea9c40" Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.534437 4742 scope.go:117] "RemoveContainer" containerID="b593dfc5a9338fcede8152194cec4dfa76cb6f55430e5988022d2c2b04b743e2" Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.541806 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g8xmv"] Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.549769 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g8xmv"] Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.571079 4742 scope.go:117] "RemoveContainer" containerID="fca018b4ea5dc4cad992d153caef56311427aed8f280b91e075763eb3edfaa7b" Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.603064 4742 scope.go:117] "RemoveContainer" containerID="433fa45a6b4d763fbc6481ea93cc03c7fe484c45020bf19a21d315bbf6ea9c40" Mar 17 11:54:24 crc kubenswrapper[4742]: E0317 11:54:24.603556 4742 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"433fa45a6b4d763fbc6481ea93cc03c7fe484c45020bf19a21d315bbf6ea9c40\": container with ID starting with 433fa45a6b4d763fbc6481ea93cc03c7fe484c45020bf19a21d315bbf6ea9c40 not found: ID does not exist" containerID="433fa45a6b4d763fbc6481ea93cc03c7fe484c45020bf19a21d315bbf6ea9c40" Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.603715 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"433fa45a6b4d763fbc6481ea93cc03c7fe484c45020bf19a21d315bbf6ea9c40"} err="failed to get container status \"433fa45a6b4d763fbc6481ea93cc03c7fe484c45020bf19a21d315bbf6ea9c40\": rpc error: code = NotFound desc = could not find container \"433fa45a6b4d763fbc6481ea93cc03c7fe484c45020bf19a21d315bbf6ea9c40\": container with ID starting with 433fa45a6b4d763fbc6481ea93cc03c7fe484c45020bf19a21d315bbf6ea9c40 not found: ID does not exist" Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.603819 4742 scope.go:117] "RemoveContainer" containerID="b593dfc5a9338fcede8152194cec4dfa76cb6f55430e5988022d2c2b04b743e2" Mar 17 11:54:24 crc kubenswrapper[4742]: E0317 11:54:24.604200 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b593dfc5a9338fcede8152194cec4dfa76cb6f55430e5988022d2c2b04b743e2\": container with ID starting with b593dfc5a9338fcede8152194cec4dfa76cb6f55430e5988022d2c2b04b743e2 not found: ID does not exist" containerID="b593dfc5a9338fcede8152194cec4dfa76cb6f55430e5988022d2c2b04b743e2" Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.604235 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b593dfc5a9338fcede8152194cec4dfa76cb6f55430e5988022d2c2b04b743e2"} err="failed to get container status \"b593dfc5a9338fcede8152194cec4dfa76cb6f55430e5988022d2c2b04b743e2\": rpc error: code = NotFound desc = could not find container \"b593dfc5a9338fcede8152194cec4dfa76cb6f55430e5988022d2c2b04b743e2\": container with ID starting with b593dfc5a9338fcede8152194cec4dfa76cb6f55430e5988022d2c2b04b743e2 not found: ID does not exist" Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.604253 4742 scope.go:117] "RemoveContainer" containerID="fca018b4ea5dc4cad992d153caef56311427aed8f280b91e075763eb3edfaa7b" Mar 17 11:54:24 crc kubenswrapper[4742]: E0317 11:54:24.604505 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fca018b4ea5dc4cad992d153caef56311427aed8f280b91e075763eb3edfaa7b\": container with ID starting with fca018b4ea5dc4cad992d153caef56311427aed8f280b91e075763eb3edfaa7b not found: ID does not exist" containerID="fca018b4ea5dc4cad992d153caef56311427aed8f280b91e075763eb3edfaa7b" Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.604542 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fca018b4ea5dc4cad992d153caef56311427aed8f280b91e075763eb3edfaa7b"} err="failed to get container status \"fca018b4ea5dc4cad992d153caef56311427aed8f280b91e075763eb3edfaa7b\": rpc error: code = NotFound desc = could not find container \"fca018b4ea5dc4cad992d153caef56311427aed8f280b91e075763eb3edfaa7b\": container with ID starting with fca018b4ea5dc4cad992d153caef56311427aed8f280b91e075763eb3edfaa7b not found: ID does not exist" Mar 17 11:54:24 crc kubenswrapper[4742]: I0317 11:54:24.676352 4742 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="31680d2d-dcd5-4ecf-a19b-246b69d584e5" path="/var/lib/kubelet/pods/31680d2d-dcd5-4ecf-a19b-246b69d584e5/volumes" Mar 17 11:54:25 crc kubenswrapper[4742]: I0317 11:54:25.750771 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xfqld"] Mar 17 11:54:25 crc kubenswrapper[4742]: I0317 11:54:25.751169 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xfqld" podUID="0b2be836-5107-4b16-9210-594dfd932c40" containerName="registry-server" containerID="cri-o://08fdf4fe9b55465cbb76f0ff413afeba02812bb99fb8c722df355c43cdd85867" gracePeriod=2 Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.246935 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xfqld" Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.342898 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2be836-5107-4b16-9210-594dfd932c40-utilities\") pod \"0b2be836-5107-4b16-9210-594dfd932c40\" (UID: \"0b2be836-5107-4b16-9210-594dfd932c40\") " Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.343075 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrvlt\" (UniqueName: \"kubernetes.io/projected/0b2be836-5107-4b16-9210-594dfd932c40-kube-api-access-qrvlt\") pod \"0b2be836-5107-4b16-9210-594dfd932c40\" (UID: \"0b2be836-5107-4b16-9210-594dfd932c40\") " Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.343127 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2be836-5107-4b16-9210-594dfd932c40-catalog-content\") pod \"0b2be836-5107-4b16-9210-594dfd932c40\" (UID: \"0b2be836-5107-4b16-9210-594dfd932c40\") " Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.343807 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b2be836-5107-4b16-9210-594dfd932c40-utilities" (OuterVolumeSpecName: "utilities") pod "0b2be836-5107-4b16-9210-594dfd932c40" (UID: "0b2be836-5107-4b16-9210-594dfd932c40"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.349808 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b2be836-5107-4b16-9210-594dfd932c40-kube-api-access-qrvlt" (OuterVolumeSpecName: "kube-api-access-qrvlt") pod "0b2be836-5107-4b16-9210-594dfd932c40" (UID: "0b2be836-5107-4b16-9210-594dfd932c40"). InnerVolumeSpecName "kube-api-access-qrvlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.382595 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b2be836-5107-4b16-9210-594dfd932c40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b2be836-5107-4b16-9210-594dfd932c40" (UID: "0b2be836-5107-4b16-9210-594dfd932c40"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.445097 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrvlt\" (UniqueName: \"kubernetes.io/projected/0b2be836-5107-4b16-9210-594dfd932c40-kube-api-access-qrvlt\") on node \"crc\" DevicePath \"\"" Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.445129 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2be836-5107-4b16-9210-594dfd932c40-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.445140 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2be836-5107-4b16-9210-594dfd932c40-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.530694 4742 generic.go:334] "Generic (PLEG): container finished" podID="0b2be836-5107-4b16-9210-594dfd932c40" containerID="08fdf4fe9b55465cbb76f0ff413afeba02812bb99fb8c722df355c43cdd85867" exitCode=0 Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.530790 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xfqld" Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.530821 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xfqld" event={"ID":"0b2be836-5107-4b16-9210-594dfd932c40","Type":"ContainerDied","Data":"08fdf4fe9b55465cbb76f0ff413afeba02812bb99fb8c722df355c43cdd85867"} Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.531291 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xfqld" event={"ID":"0b2be836-5107-4b16-9210-594dfd932c40","Type":"ContainerDied","Data":"95e73ad8daaf0618679042d0955e1c7de609f28d18d2a844f91124fe86b39330"} Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.531334 4742 scope.go:117] "RemoveContainer" containerID="08fdf4fe9b55465cbb76f0ff413afeba02812bb99fb8c722df355c43cdd85867" Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.570092 4742 scope.go:117] "RemoveContainer" containerID="f32e8563f67d5389c5735832753efdea170e59145f3fee7dfd3b854ee9502c16" Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.575673 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xfqld"] Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.583668 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xfqld"] Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.597138 4742 scope.go:117] "RemoveContainer" containerID="77d2920c565f1fad45d6515458e01311c687958bd2f408f10174f8944e6f8e94" Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.646778 4742 scope.go:117] "RemoveContainer" containerID="08fdf4fe9b55465cbb76f0ff413afeba02812bb99fb8c722df355c43cdd85867" Mar 17 11:54:26 crc kubenswrapper[4742]: E0317 11:54:26.647372 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08fdf4fe9b55465cbb76f0ff413afeba02812bb99fb8c722df355c43cdd85867\": container with ID starting with 08fdf4fe9b55465cbb76f0ff413afeba02812bb99fb8c722df355c43cdd85867 not found: ID does not exist" containerID="08fdf4fe9b55465cbb76f0ff413afeba02812bb99fb8c722df355c43cdd85867" Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.647420 4742 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08fdf4fe9b55465cbb76f0ff413afeba02812bb99fb8c722df355c43cdd85867"} err="failed to get container status \"08fdf4fe9b55465cbb76f0ff413afeba02812bb99fb8c722df355c43cdd85867\": rpc error: code = NotFound desc = could not find container \"08fdf4fe9b55465cbb76f0ff413afeba02812bb99fb8c722df355c43cdd85867\": container with ID starting with 08fdf4fe9b55465cbb76f0ff413afeba02812bb99fb8c722df355c43cdd85867 not found: ID does not exist" Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.647447 4742 scope.go:117] "RemoveContainer" containerID="f32e8563f67d5389c5735832753efdea170e59145f3fee7dfd3b854ee9502c16" Mar 17 11:54:26 crc kubenswrapper[4742]: E0317 11:54:26.648935 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f32e8563f67d5389c5735832753efdea170e59145f3fee7dfd3b854ee9502c16\": container with ID starting with f32e8563f67d5389c5735832753efdea170e59145f3fee7dfd3b854ee9502c16 not found: ID does not exist" containerID="f32e8563f67d5389c5735832753efdea170e59145f3fee7dfd3b854ee9502c16" Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.648970 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f32e8563f67d5389c5735832753efdea170e59145f3fee7dfd3b854ee9502c16"} err="failed to get container status \"f32e8563f67d5389c5735832753efdea170e59145f3fee7dfd3b854ee9502c16\": rpc error: code = NotFound desc = could not find container \"f32e8563f67d5389c5735832753efdea170e59145f3fee7dfd3b854ee9502c16\": container with ID starting with f32e8563f67d5389c5735832753efdea170e59145f3fee7dfd3b854ee9502c16 not found: ID does not exist" Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.648990 4742 scope.go:117] "RemoveContainer" containerID="77d2920c565f1fad45d6515458e01311c687958bd2f408f10174f8944e6f8e94" Mar 17 11:54:26 crc kubenswrapper[4742]: E0317 11:54:26.649279 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77d2920c565f1fad45d6515458e01311c687958bd2f408f10174f8944e6f8e94\": container with ID starting with 77d2920c565f1fad45d6515458e01311c687958bd2f408f10174f8944e6f8e94 not found: ID does not exist" containerID="77d2920c565f1fad45d6515458e01311c687958bd2f408f10174f8944e6f8e94" Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.649308 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77d2920c565f1fad45d6515458e01311c687958bd2f408f10174f8944e6f8e94"} err="failed to get container status \"77d2920c565f1fad45d6515458e01311c687958bd2f408f10174f8944e6f8e94\": rpc error: code = NotFound desc = could not find container \"77d2920c565f1fad45d6515458e01311c687958bd2f408f10174f8944e6f8e94\": container with ID starting with 77d2920c565f1fad45d6515458e01311c687958bd2f408f10174f8944e6f8e94 not found: ID does not exist" Mar 17 11:54:26 crc kubenswrapper[4742]: I0317 11:54:26.672306 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b2be836-5107-4b16-9210-594dfd932c40" path="/var/lib/kubelet/pods/0b2be836-5107-4b16-9210-594dfd932c40/volumes" Mar 17 11:54:34 crc kubenswrapper[4742]: I0317 11:54:34.663456 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:54:34 crc kubenswrapper[4742]: E0317 11:54:34.664211 4742 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 11:54:48 crc kubenswrapper[4742]: I0317 11:54:48.673666 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:54:49 crc kubenswrapper[4742]: I0317 11:54:49.791674 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"1750f423eed5ff73f33cbeaf0c7b4d19bec40d4ee6133f2c8141f4cdf4bd6cdf"} Mar 17 11:54:55 crc kubenswrapper[4742]: I0317 11:54:55.990142 4742 scope.go:117] "RemoveContainer" containerID="f4cae3987486b6b32db2c0eb80fb9aad24c86de206830bf9ad232970128bd8e3" Mar 17 11:55:27 crc kubenswrapper[4742]: I0317 11:55:27.168377 4742 generic.go:334] "Generic (PLEG): container finished" podID="6468192a-58e3-4b66-9551-1d67dc93f0ae" containerID="efc6ca66e9e7b010b4631451b7caecc164b8c97910f41a336c99ea44e1f0ea83" exitCode=0 Mar 17 11:55:27 crc kubenswrapper[4742]: I0317 11:55:27.168487 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" event={"ID":"6468192a-58e3-4b66-9551-1d67dc93f0ae","Type":"ContainerDied","Data":"efc6ca66e9e7b010b4631451b7caecc164b8c97910f41a336c99ea44e1f0ea83"} Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.628201 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.777447 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-inventory\") pod \"6468192a-58e3-4b66-9551-1d67dc93f0ae\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.777497 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-extra-config-0\") pod \"6468192a-58e3-4b66-9551-1d67dc93f0ae\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.777517 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-2\") pod \"6468192a-58e3-4b66-9551-1d67dc93f0ae\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.777597 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-1\") pod \"6468192a-58e3-4b66-9551-1d67dc93f0ae\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.777662 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-0\") pod \"6468192a-58e3-4b66-9551-1d67dc93f0ae\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.777692 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mnk6\" (UniqueName: \"kubernetes.io/projected/6468192a-58e3-4b66-9551-1d67dc93f0ae-kube-api-access-7mnk6\") pod \"6468192a-58e3-4b66-9551-1d67dc93f0ae\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.777723 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-combined-ca-bundle\") pod \"6468192a-58e3-4b66-9551-1d67dc93f0ae\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.777758 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-migration-ssh-key-1\") pod \"6468192a-58e3-4b66-9551-1d67dc93f0ae\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.777840 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-ssh-key-openstack-edpm-ipam\") pod \"6468192a-58e3-4b66-9551-1d67dc93f0ae\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.777875 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-3\") pod \"6468192a-58e3-4b66-9551-1d67dc93f0ae\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.777934 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-migration-ssh-key-0\") pod \"6468192a-58e3-4b66-9551-1d67dc93f0ae\" (UID: \"6468192a-58e3-4b66-9551-1d67dc93f0ae\") " Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.818031 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "6468192a-58e3-4b66-9551-1d67dc93f0ae" (UID: "6468192a-58e3-4b66-9551-1d67dc93f0ae"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.826387 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6468192a-58e3-4b66-9551-1d67dc93f0ae-kube-api-access-7mnk6" (OuterVolumeSpecName: "kube-api-access-7mnk6") pod "6468192a-58e3-4b66-9551-1d67dc93f0ae" (UID: "6468192a-58e3-4b66-9551-1d67dc93f0ae"). InnerVolumeSpecName "kube-api-access-7mnk6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.854744 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "6468192a-58e3-4b66-9551-1d67dc93f0ae" (UID: "6468192a-58e3-4b66-9551-1d67dc93f0ae"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.869074 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "6468192a-58e3-4b66-9551-1d67dc93f0ae" (UID: "6468192a-58e3-4b66-9551-1d67dc93f0ae"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.871324 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "6468192a-58e3-4b66-9551-1d67dc93f0ae" (UID: "6468192a-58e3-4b66-9551-1d67dc93f0ae"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.874261 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "6468192a-58e3-4b66-9551-1d67dc93f0ae" (UID: "6468192a-58e3-4b66-9551-1d67dc93f0ae"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.874187 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "6468192a-58e3-4b66-9551-1d67dc93f0ae" (UID: "6468192a-58e3-4b66-9551-1d67dc93f0ae"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.879074 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "6468192a-58e3-4b66-9551-1d67dc93f0ae" (UID: "6468192a-58e3-4b66-9551-1d67dc93f0ae"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.879947 4742 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.879970 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mnk6\" (UniqueName: \"kubernetes.io/projected/6468192a-58e3-4b66-9551-1d67dc93f0ae-kube-api-access-7mnk6\") on node \"crc\" DevicePath \"\"" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.879979 4742 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.879988 4742 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.879998 4742 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.880006 4742 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.880014 4742 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.880023 4742 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.880243 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "6468192a-58e3-4b66-9551-1d67dc93f0ae" (UID: "6468192a-58e3-4b66-9551-1d67dc93f0ae"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.882060 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6468192a-58e3-4b66-9551-1d67dc93f0ae" (UID: "6468192a-58e3-4b66-9551-1d67dc93f0ae"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.888361 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-inventory" (OuterVolumeSpecName: "inventory") pod "6468192a-58e3-4b66-9551-1d67dc93f0ae" (UID: "6468192a-58e3-4b66-9551-1d67dc93f0ae"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.982017 4742 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.982041 4742 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 17 11:55:28 crc kubenswrapper[4742]: I0317 11:55:28.982050 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6468192a-58e3-4b66-9551-1d67dc93f0ae-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.191371 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" event={"ID":"6468192a-58e3-4b66-9551-1d67dc93f0ae","Type":"ContainerDied","Data":"154b6ce5d54fb1cec325c0b1b1839372710f1d52250d772935184c6d9ecf5e5c"} Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.191408 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="154b6ce5d54fb1cec325c0b1b1839372710f1d52250d772935184c6d9ecf5e5c" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.191470 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-76jn7" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.337882 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw"] Mar 17 11:55:29 crc kubenswrapper[4742]: E0317 11:55:29.338295 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6468192a-58e3-4b66-9551-1d67dc93f0ae" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.338316 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="6468192a-58e3-4b66-9551-1d67dc93f0ae" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 17 11:55:29 crc kubenswrapper[4742]: E0317 11:55:29.338335 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31680d2d-dcd5-4ecf-a19b-246b69d584e5" containerName="extract-content" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.338343 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="31680d2d-dcd5-4ecf-a19b-246b69d584e5" containerName="extract-content" Mar 17 11:55:29 crc kubenswrapper[4742]: E0317 11:55:29.338358 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2be836-5107-4b16-9210-594dfd932c40" containerName="extract-content" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.338366 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2be836-5107-4b16-9210-594dfd932c40" containerName="extract-content" Mar 17 11:55:29 crc kubenswrapper[4742]: E0317 11:55:29.338383 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2be836-5107-4b16-9210-594dfd932c40" containerName="extract-utilities" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.338392 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2be836-5107-4b16-9210-594dfd932c40" containerName="extract-utilities" Mar 17 11:55:29 crc kubenswrapper[4742]: E0317 11:55:29.338402 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2be836-5107-4b16-9210-594dfd932c40" containerName="registry-server" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.338411 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2be836-5107-4b16-9210-594dfd932c40" containerName="registry-server" Mar 17 11:55:29 crc kubenswrapper[4742]: E0317 11:55:29.338436 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31680d2d-dcd5-4ecf-a19b-246b69d584e5" containerName="extract-utilities" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.338445 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="31680d2d-dcd5-4ecf-a19b-246b69d584e5" containerName="extract-utilities" Mar 17 11:55:29 crc kubenswrapper[4742]: E0317 11:55:29.338457 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31680d2d-dcd5-4ecf-a19b-246b69d584e5" containerName="registry-server" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.338465 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="31680d2d-dcd5-4ecf-a19b-246b69d584e5" containerName="registry-server" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.338680 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="31680d2d-dcd5-4ecf-a19b-246b69d584e5" containerName="registry-server" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.338709 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b2be836-5107-4b16-9210-594dfd932c40" containerName="registry-server" Mar 17 11:55:29 crc 
kubenswrapper[4742]: I0317 11:55:29.338731 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="6468192a-58e3-4b66-9551-1d67dc93f0ae" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.339469 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.348049 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-n8b7p" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.348070 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.348133 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.348137 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.348063 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.349890 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw"] Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.490643 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.490882 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.490972 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.491096 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.491138 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
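The RemoveStaleState burst above fires on pod admission: before the telemetry pod is admitted, the CPU and memory managers sweep out assignments still recorded for pods the kubelet no longer tracks (the two catalog pods and the finished nova-edpm job). A simplified sketch of that sweep (structures are stand-ins, not the kubelet's state types):

```go
package main

import "fmt"

// removeStaleState drops resource assignments whose owning pod is no longer
// active, the same housekeeping the cpu_manager/memory_manager lines record.
func removeStaleState(assignments map[string]string, activePods map[string]bool) {
	for podUID, container := range assignments {
		if !activePods[podUID] {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n", podUID, container)
			delete(assignments, podUID)
		}
	}
}

func main() {
	assignments := map[string]string{
		"31680d2d-dcd5-4ecf-a19b-246b69d584e5": "registry-server", // pod already deleted
	}
	active := map[string]bool{"24003f05-4f7d-443d-8a19-8162dae339a2": true}
	removeStaleState(assignments, active)
}
```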
volume \"kube-api-access-fspfb\" (UniqueName: \"kubernetes.io/projected/24003f05-4f7d-443d-8a19-8162dae339a2-kube-api-access-fspfb\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.491237 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.491275 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.593138 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.593436 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.593484 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.593515 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fspfb\" (UniqueName: \"kubernetes.io/projected/24003f05-4f7d-443d-8a19-8162dae339a2-kube-api-access-fspfb\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.593616 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc 
kubenswrapper[4742]: I0317 11:55:29.593642 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.593736 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.597718 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.598848 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.600036 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.601188 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.602707 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.612820 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.617160 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fspfb\" (UniqueName: \"kubernetes.io/projected/24003f05-4f7d-443d-8a19-8162dae339a2-kube-api-access-fspfb\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:29 crc kubenswrapper[4742]: I0317 11:55:29.673697 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:55:30 crc kubenswrapper[4742]: I0317 11:55:30.315720 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw"] Mar 17 11:55:31 crc kubenswrapper[4742]: I0317 11:55:31.208713 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" event={"ID":"24003f05-4f7d-443d-8a19-8162dae339a2","Type":"ContainerStarted","Data":"4a8e03439bbaa326a1b03173dcb86d956aa885c7b0ea553cde2e926e2015ab21"} Mar 17 11:55:31 crc kubenswrapper[4742]: I0317 11:55:31.209301 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" event={"ID":"24003f05-4f7d-443d-8a19-8162dae339a2","Type":"ContainerStarted","Data":"815943fe885d92b5fb6fed59db43b6260e27cde184a602e40ddf96263515dacf"} Mar 17 11:55:31 crc kubenswrapper[4742]: I0317 11:55:31.225734 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" podStartSLOduration=1.759357668 podStartE2EDuration="2.225717608s" podCreationTimestamp="2026-03-17 11:55:29 +0000 UTC" firstStartedPulling="2026-03-17 11:55:30.312164377 +0000 UTC m=+2633.438292135" lastFinishedPulling="2026-03-17 11:55:30.778524297 +0000 UTC m=+2633.904652075" observedRunningTime="2026-03-17 11:55:31.223304451 +0000 UTC m=+2634.349432199" watchObservedRunningTime="2026-03-17 11:55:31.225717608 +0000 UTC m=+2634.351845366" Mar 17 11:56:00 crc kubenswrapper[4742]: I0317 11:56:00.160086 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562476-lq8cp"] Mar 17 11:56:00 crc kubenswrapper[4742]: I0317 11:56:00.162676 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562476-lq8cp" Mar 17 11:56:00 crc kubenswrapper[4742]: I0317 11:56:00.165624 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:56:00 crc kubenswrapper[4742]: I0317 11:56:00.167452 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:56:00 crc kubenswrapper[4742]: I0317 11:56:00.170383 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:56:00 crc kubenswrapper[4742]: I0317 11:56:00.174357 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562476-lq8cp"] Mar 17 11:56:00 crc kubenswrapper[4742]: I0317 11:56:00.245580 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsd8d\" (UniqueName: \"kubernetes.io/projected/c08c12ce-725a-4532-990a-136c1cf8c8a3-kube-api-access-qsd8d\") pod \"auto-csr-approver-29562476-lq8cp\" (UID: \"c08c12ce-725a-4532-990a-136c1cf8c8a3\") " pod="openshift-infra/auto-csr-approver-29562476-lq8cp" Mar 17 11:56:00 crc kubenswrapper[4742]: I0317 11:56:00.346929 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsd8d\" (UniqueName: \"kubernetes.io/projected/c08c12ce-725a-4532-990a-136c1cf8c8a3-kube-api-access-qsd8d\") pod \"auto-csr-approver-29562476-lq8cp\" (UID: \"c08c12ce-725a-4532-990a-136c1cf8c8a3\") " pod="openshift-infra/auto-csr-approver-29562476-lq8cp" Mar 17 11:56:00 crc kubenswrapper[4742]: I0317 11:56:00.381125 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsd8d\" (UniqueName: \"kubernetes.io/projected/c08c12ce-725a-4532-990a-136c1cf8c8a3-kube-api-access-qsd8d\") pod \"auto-csr-approver-29562476-lq8cp\" (UID: \"c08c12ce-725a-4532-990a-136c1cf8c8a3\") " pod="openshift-infra/auto-csr-approver-29562476-lq8cp" Mar 17 11:56:00 crc kubenswrapper[4742]: I0317 11:56:00.487331 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562476-lq8cp" Mar 17 11:56:00 crc kubenswrapper[4742]: W0317 11:56:00.988785 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc08c12ce_725a_4532_990a_136c1cf8c8a3.slice/crio-6479692dd54e93fef2648508ded43a2e303539c30ddcfeb57f5ecaed225eddfc WatchSource:0}: Error finding container 6479692dd54e93fef2648508ded43a2e303539c30ddcfeb57f5ecaed225eddfc: Status 404 returned error can't find the container with id 6479692dd54e93fef2648508ded43a2e303539c30ddcfeb57f5ecaed225eddfc Mar 17 11:56:00 crc kubenswrapper[4742]: I0317 11:56:00.992176 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562476-lq8cp"] Mar 17 11:56:01 crc kubenswrapper[4742]: I0317 11:56:01.538270 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562476-lq8cp" event={"ID":"c08c12ce-725a-4532-990a-136c1cf8c8a3","Type":"ContainerStarted","Data":"6479692dd54e93fef2648508ded43a2e303539c30ddcfeb57f5ecaed225eddfc"} Mar 17 11:56:03 crc kubenswrapper[4742]: I0317 11:56:03.561388 4742 generic.go:334] "Generic (PLEG): container finished" podID="c08c12ce-725a-4532-990a-136c1cf8c8a3" containerID="784c9eb64c3ba726b37a43a52b288409f503d47fc21d7380a1bfd3e5b8a2aac0" exitCode=0 Mar 17 11:56:03 crc kubenswrapper[4742]: I0317 11:56:03.561541 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562476-lq8cp" event={"ID":"c08c12ce-725a-4532-990a-136c1cf8c8a3","Type":"ContainerDied","Data":"784c9eb64c3ba726b37a43a52b288409f503d47fc21d7380a1bfd3e5b8a2aac0"} Mar 17 11:56:05 crc kubenswrapper[4742]: I0317 11:56:05.051813 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562476-lq8cp" Mar 17 11:56:05 crc kubenswrapper[4742]: I0317 11:56:05.245612 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsd8d\" (UniqueName: \"kubernetes.io/projected/c08c12ce-725a-4532-990a-136c1cf8c8a3-kube-api-access-qsd8d\") pod \"c08c12ce-725a-4532-990a-136c1cf8c8a3\" (UID: \"c08c12ce-725a-4532-990a-136c1cf8c8a3\") " Mar 17 11:56:05 crc kubenswrapper[4742]: I0317 11:56:05.252395 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c08c12ce-725a-4532-990a-136c1cf8c8a3-kube-api-access-qsd8d" (OuterVolumeSpecName: "kube-api-access-qsd8d") pod "c08c12ce-725a-4532-990a-136c1cf8c8a3" (UID: "c08c12ce-725a-4532-990a-136c1cf8c8a3"). InnerVolumeSpecName "kube-api-access-qsd8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:56:05 crc kubenswrapper[4742]: I0317 11:56:05.348608 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsd8d\" (UniqueName: \"kubernetes.io/projected/c08c12ce-725a-4532-990a-136c1cf8c8a3-kube-api-access-qsd8d\") on node \"crc\" DevicePath \"\"" Mar 17 11:56:05 crc kubenswrapper[4742]: I0317 11:56:05.585877 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562476-lq8cp" event={"ID":"c08c12ce-725a-4532-990a-136c1cf8c8a3","Type":"ContainerDied","Data":"6479692dd54e93fef2648508ded43a2e303539c30ddcfeb57f5ecaed225eddfc"} Mar 17 11:56:05 crc kubenswrapper[4742]: I0317 11:56:05.585935 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6479692dd54e93fef2648508ded43a2e303539c30ddcfeb57f5ecaed225eddfc" Mar 17 11:56:05 crc kubenswrapper[4742]: I0317 11:56:05.586002 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562476-lq8cp" Mar 17 11:56:06 crc kubenswrapper[4742]: I0317 11:56:06.137443 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562470-l9qjt"] Mar 17 11:56:06 crc kubenswrapper[4742]: I0317 11:56:06.149680 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562470-l9qjt"] Mar 17 11:56:06 crc kubenswrapper[4742]: I0317 11:56:06.675900 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f447ef44-8f08-4573-b884-fa3a098bc306" path="/var/lib/kubelet/pods/f447ef44-8f08-4573-b884-fa3a098bc306/volumes" Mar 17 11:56:48 crc kubenswrapper[4742]: I0317 11:56:48.044655 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:56:48 crc kubenswrapper[4742]: I0317 11:56:48.045415 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:56:56 crc kubenswrapper[4742]: I0317 11:56:56.150171 4742 scope.go:117] "RemoveContainer" containerID="a6e05879ddb53b0bf63695538fe9524047bd7010c4daf32e55e9a8e14977ea5b" Mar 17 11:57:18 crc kubenswrapper[4742]: I0317 11:57:18.045020 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:57:18 crc kubenswrapper[4742]: I0317 11:57:18.045862 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:57:48 crc kubenswrapper[4742]: I0317 11:57:48.043993 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 11:57:48 crc kubenswrapper[4742]: I0317 11:57:48.046855 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 11:57:48 crc kubenswrapper[4742]: I0317 11:57:48.047063 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 11:57:48 crc kubenswrapper[4742]: I0317 11:57:48.048678 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1750f423eed5ff73f33cbeaf0c7b4d19bec40d4ee6133f2c8141f4cdf4bd6cdf"} pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 11:57:48 crc kubenswrapper[4742]: I0317 11:57:48.048873 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" containerID="cri-o://1750f423eed5ff73f33cbeaf0c7b4d19bec40d4ee6133f2c8141f4cdf4bd6cdf" gracePeriod=600 Mar 17 11:57:48 crc kubenswrapper[4742]: I0317 11:57:48.824035 4742 generic.go:334] "Generic (PLEG): container finished" podID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerID="1750f423eed5ff73f33cbeaf0c7b4d19bec40d4ee6133f2c8141f4cdf4bd6cdf" exitCode=0 Mar 17 11:57:48 crc kubenswrapper[4742]: I0317 11:57:48.824101 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerDied","Data":"1750f423eed5ff73f33cbeaf0c7b4d19bec40d4ee6133f2c8141f4cdf4bd6cdf"} Mar 17 11:57:48 crc kubenswrapper[4742]: I0317 11:57:48.824368 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50"} Mar 17 11:57:48 crc kubenswrapper[4742]: I0317 11:57:48.824392 4742 scope.go:117] "RemoveContainer" containerID="f3a27115b85cbc39b3abe19df35d5392f351c27ea5f3d43e6c8cefb9e7d0e3ca" Mar 17 11:58:00 crc kubenswrapper[4742]: I0317 11:58:00.150322 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562478-47wq6"] Mar 17 11:58:00 crc kubenswrapper[4742]: E0317 11:58:00.151351 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c08c12ce-725a-4532-990a-136c1cf8c8a3" containerName="oc" Mar 17 11:58:00 crc kubenswrapper[4742]: I0317 11:58:00.151367 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08c12ce-725a-4532-990a-136c1cf8c8a3" containerName="oc" Mar 17 11:58:00 crc kubenswrapper[4742]: I0317 11:58:00.151660 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="c08c12ce-725a-4532-990a-136c1cf8c8a3" containerName="oc" Mar 17 11:58:00 crc kubenswrapper[4742]: I0317 11:58:00.152487 4742 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562478-47wq6" Mar 17 11:58:00 crc kubenswrapper[4742]: I0317 11:58:00.154045 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 11:58:00 crc kubenswrapper[4742]: I0317 11:58:00.154129 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 11:58:00 crc kubenswrapper[4742]: I0317 11:58:00.155580 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 11:58:00 crc kubenswrapper[4742]: I0317 11:58:00.162992 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562478-47wq6"] Mar 17 11:58:00 crc kubenswrapper[4742]: I0317 11:58:00.317284 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jnb4\" (UniqueName: \"kubernetes.io/projected/610be10f-b292-4196-bea3-8a5de3e562c3-kube-api-access-9jnb4\") pod \"auto-csr-approver-29562478-47wq6\" (UID: \"610be10f-b292-4196-bea3-8a5de3e562c3\") " pod="openshift-infra/auto-csr-approver-29562478-47wq6" Mar 17 11:58:00 crc kubenswrapper[4742]: I0317 11:58:00.419261 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jnb4\" (UniqueName: \"kubernetes.io/projected/610be10f-b292-4196-bea3-8a5de3e562c3-kube-api-access-9jnb4\") pod \"auto-csr-approver-29562478-47wq6\" (UID: \"610be10f-b292-4196-bea3-8a5de3e562c3\") " pod="openshift-infra/auto-csr-approver-29562478-47wq6" Mar 17 11:58:00 crc kubenswrapper[4742]: I0317 11:58:00.444791 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jnb4\" (UniqueName: \"kubernetes.io/projected/610be10f-b292-4196-bea3-8a5de3e562c3-kube-api-access-9jnb4\") pod \"auto-csr-approver-29562478-47wq6\" (UID: \"610be10f-b292-4196-bea3-8a5de3e562c3\") " pod="openshift-infra/auto-csr-approver-29562478-47wq6" Mar 17 11:58:00 crc kubenswrapper[4742]: I0317 11:58:00.484464 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562478-47wq6" Mar 17 11:58:01 crc kubenswrapper[4742]: W0317 11:58:01.039746 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod610be10f_b292_4196_bea3_8a5de3e562c3.slice/crio-dd4a10bf9ef7b4c3a8cd1ec3ad6eed298112ac6c92f4d4151535ec9788e707f7 WatchSource:0}: Error finding container dd4a10bf9ef7b4c3a8cd1ec3ad6eed298112ac6c92f4d4151535ec9788e707f7: Status 404 returned error can't find the container with id dd4a10bf9ef7b4c3a8cd1ec3ad6eed298112ac6c92f4d4151535ec9788e707f7 Mar 17 11:58:01 crc kubenswrapper[4742]: I0317 11:58:01.041987 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562478-47wq6"] Mar 17 11:58:01 crc kubenswrapper[4742]: I0317 11:58:01.987445 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562478-47wq6" event={"ID":"610be10f-b292-4196-bea3-8a5de3e562c3","Type":"ContainerStarted","Data":"dd4a10bf9ef7b4c3a8cd1ec3ad6eed298112ac6c92f4d4151535ec9788e707f7"} Mar 17 11:58:03 crc kubenswrapper[4742]: I0317 11:58:03.002251 4742 generic.go:334] "Generic (PLEG): container finished" podID="610be10f-b292-4196-bea3-8a5de3e562c3" containerID="2e2a9b61ee805dab5cce2a2a8e7e4a8772f1b8d7c411a195cbc4ca953f251b23" exitCode=0 Mar 17 11:58:03 crc kubenswrapper[4742]: I0317 11:58:03.002312 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562478-47wq6" event={"ID":"610be10f-b292-4196-bea3-8a5de3e562c3","Type":"ContainerDied","Data":"2e2a9b61ee805dab5cce2a2a8e7e4a8772f1b8d7c411a195cbc4ca953f251b23"} Mar 17 11:58:04 crc kubenswrapper[4742]: I0317 11:58:04.395721 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562478-47wq6" Mar 17 11:58:04 crc kubenswrapper[4742]: I0317 11:58:04.430330 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jnb4\" (UniqueName: \"kubernetes.io/projected/610be10f-b292-4196-bea3-8a5de3e562c3-kube-api-access-9jnb4\") pod \"610be10f-b292-4196-bea3-8a5de3e562c3\" (UID: \"610be10f-b292-4196-bea3-8a5de3e562c3\") " Mar 17 11:58:04 crc kubenswrapper[4742]: I0317 11:58:04.448279 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/610be10f-b292-4196-bea3-8a5de3e562c3-kube-api-access-9jnb4" (OuterVolumeSpecName: "kube-api-access-9jnb4") pod "610be10f-b292-4196-bea3-8a5de3e562c3" (UID: "610be10f-b292-4196-bea3-8a5de3e562c3"). InnerVolumeSpecName "kube-api-access-9jnb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:58:04 crc kubenswrapper[4742]: I0317 11:58:04.533347 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jnb4\" (UniqueName: \"kubernetes.io/projected/610be10f-b292-4196-bea3-8a5de3e562c3-kube-api-access-9jnb4\") on node \"crc\" DevicePath \"\"" Mar 17 11:58:05 crc kubenswrapper[4742]: I0317 11:58:05.029926 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562478-47wq6" event={"ID":"610be10f-b292-4196-bea3-8a5de3e562c3","Type":"ContainerDied","Data":"dd4a10bf9ef7b4c3a8cd1ec3ad6eed298112ac6c92f4d4151535ec9788e707f7"} Mar 17 11:58:05 crc kubenswrapper[4742]: I0317 11:58:05.029991 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd4a10bf9ef7b4c3a8cd1ec3ad6eed298112ac6c92f4d4151535ec9788e707f7" Mar 17 11:58:05 crc kubenswrapper[4742]: I0317 11:58:05.030045 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562478-47wq6" Mar 17 11:58:05 crc kubenswrapper[4742]: I0317 11:58:05.497680 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562472-mrxmh"] Mar 17 11:58:05 crc kubenswrapper[4742]: I0317 11:58:05.509689 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562472-mrxmh"] Mar 17 11:58:06 crc kubenswrapper[4742]: I0317 11:58:06.680828 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f49e6b82-e398-4429-bacc-57c2ec258328" path="/var/lib/kubelet/pods/f49e6b82-e398-4429-bacc-57c2ec258328/volumes" Mar 17 11:58:09 crc kubenswrapper[4742]: I0317 11:58:09.077193 4742 generic.go:334] "Generic (PLEG): container finished" podID="24003f05-4f7d-443d-8a19-8162dae339a2" containerID="4a8e03439bbaa326a1b03173dcb86d956aa885c7b0ea553cde2e926e2015ab21" exitCode=0 Mar 17 11:58:09 crc kubenswrapper[4742]: I0317 11:58:09.077345 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" event={"ID":"24003f05-4f7d-443d-8a19-8162dae339a2","Type":"ContainerDied","Data":"4a8e03439bbaa326a1b03173dcb86d956aa885c7b0ea553cde2e926e2015ab21"} Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.611183 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.688879 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-2\") pod \"24003f05-4f7d-443d-8a19-8162dae339a2\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.689191 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-inventory\") pod \"24003f05-4f7d-443d-8a19-8162dae339a2\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.689247 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-0\") pod \"24003f05-4f7d-443d-8a19-8162dae339a2\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.689409 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-1\") pod \"24003f05-4f7d-443d-8a19-8162dae339a2\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.689517 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-telemetry-combined-ca-bundle\") pod \"24003f05-4f7d-443d-8a19-8162dae339a2\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.689630 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ssh-key-openstack-edpm-ipam\") pod \"24003f05-4f7d-443d-8a19-8162dae339a2\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.690095 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fspfb\" (UniqueName: \"kubernetes.io/projected/24003f05-4f7d-443d-8a19-8162dae339a2-kube-api-access-fspfb\") pod \"24003f05-4f7d-443d-8a19-8162dae339a2\" (UID: \"24003f05-4f7d-443d-8a19-8162dae339a2\") " Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.695229 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "24003f05-4f7d-443d-8a19-8162dae339a2" (UID: "24003f05-4f7d-443d-8a19-8162dae339a2"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.695250 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24003f05-4f7d-443d-8a19-8162dae339a2-kube-api-access-fspfb" (OuterVolumeSpecName: "kube-api-access-fspfb") pod "24003f05-4f7d-443d-8a19-8162dae339a2" (UID: "24003f05-4f7d-443d-8a19-8162dae339a2"). InnerVolumeSpecName "kube-api-access-fspfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.718740 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "24003f05-4f7d-443d-8a19-8162dae339a2" (UID: "24003f05-4f7d-443d-8a19-8162dae339a2"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.719184 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "24003f05-4f7d-443d-8a19-8162dae339a2" (UID: "24003f05-4f7d-443d-8a19-8162dae339a2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.724794 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-inventory" (OuterVolumeSpecName: "inventory") pod "24003f05-4f7d-443d-8a19-8162dae339a2" (UID: "24003f05-4f7d-443d-8a19-8162dae339a2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.736751 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "24003f05-4f7d-443d-8a19-8162dae339a2" (UID: "24003f05-4f7d-443d-8a19-8162dae339a2"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.755029 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "24003f05-4f7d-443d-8a19-8162dae339a2" (UID: "24003f05-4f7d-443d-8a19-8162dae339a2"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.792565 4742 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.792596 4742 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.792610 4742 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.792623 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.792636 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fspfb\" (UniqueName: \"kubernetes.io/projected/24003f05-4f7d-443d-8a19-8162dae339a2-kube-api-access-fspfb\") on node \"crc\" DevicePath \"\"" Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.792645 4742 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 17 11:58:10 crc kubenswrapper[4742]: I0317 11:58:10.792656 4742 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/24003f05-4f7d-443d-8a19-8162dae339a2-inventory\") on node \"crc\" DevicePath \"\"" Mar 17 11:58:11 crc kubenswrapper[4742]: I0317 11:58:11.105493 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" event={"ID":"24003f05-4f7d-443d-8a19-8162dae339a2","Type":"ContainerDied","Data":"815943fe885d92b5fb6fed59db43b6260e27cde184a602e40ddf96263515dacf"} Mar 17 11:58:11 crc kubenswrapper[4742]: I0317 11:58:11.105563 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="815943fe885d92b5fb6fed59db43b6260e27cde184a602e40ddf96263515dacf" Mar 17 11:58:11 crc kubenswrapper[4742]: I0317 11:58:11.106025 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw" Mar 17 11:58:56 crc kubenswrapper[4742]: I0317 11:58:56.307159 4742 scope.go:117] "RemoveContainer" containerID="b53b3649b8d974792d06967a63d509ea06e3589d42c005a9b0b82985202dc535" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.435950 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 17 11:59:03 crc kubenswrapper[4742]: E0317 11:59:03.437092 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24003f05-4f7d-443d-8a19-8162dae339a2" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.437115 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="24003f05-4f7d-443d-8a19-8162dae339a2" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 17 11:59:03 crc kubenswrapper[4742]: E0317 11:59:03.437170 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="610be10f-b292-4196-bea3-8a5de3e562c3" containerName="oc" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.437180 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="610be10f-b292-4196-bea3-8a5de3e562c3" containerName="oc" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.437421 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="24003f05-4f7d-443d-8a19-8162dae339a2" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.437457 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="610be10f-b292-4196-bea3-8a5de3e562c3" containerName="oc" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.438349 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.444240 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.444304 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.448409 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-sm8hx" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.448466 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.463712 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.512261 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/cbe323de-3d55-4905-8f28-29cea959ae35-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.512520 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/cbe323de-3d55-4905-8f28-29cea959ae35-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.512626 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.512851 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbe323de-3d55-4905-8f28-29cea959ae35-config-data\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.512957 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9npq\" (UniqueName: \"kubernetes.io/projected/cbe323de-3d55-4905-8f28-29cea959ae35-kube-api-access-l9npq\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.513052 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.513124 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.513186 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cbe323de-3d55-4905-8f28-29cea959ae35-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.513256 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.615272 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/cbe323de-3d55-4905-8f28-29cea959ae35-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.615341 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/cbe323de-3d55-4905-8f28-29cea959ae35-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.615383 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.615432 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbe323de-3d55-4905-8f28-29cea959ae35-config-data\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.615461 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9npq\" (UniqueName: \"kubernetes.io/projected/cbe323de-3d55-4905-8f28-29cea959ae35-kube-api-access-l9npq\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.615499 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.615522 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-ca-certs\") pod 
\"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.615537 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cbe323de-3d55-4905-8f28-29cea959ae35-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.615562 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.615865 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/cbe323de-3d55-4905-8f28-29cea959ae35-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.615932 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/cbe323de-3d55-4905-8f28-29cea959ae35-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.616562 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.616975 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cbe323de-3d55-4905-8f28-29cea959ae35-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.617534 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbe323de-3d55-4905-8f28-29cea959ae35-config-data\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.622900 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.623004 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 
11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.636434 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.638772 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9npq\" (UniqueName: \"kubernetes.io/projected/cbe323de-3d55-4905-8f28-29cea959ae35-kube-api-access-l9npq\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.648565 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " pod="openstack/tempest-tests-tempest" Mar 17 11:59:03 crc kubenswrapper[4742]: I0317 11:59:03.787446 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 17 11:59:04 crc kubenswrapper[4742]: I0317 11:59:04.233771 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 17 11:59:04 crc kubenswrapper[4742]: I0317 11:59:04.240702 4742 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 11:59:04 crc kubenswrapper[4742]: I0317 11:59:04.761553 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"cbe323de-3d55-4905-8f28-29cea959ae35","Type":"ContainerStarted","Data":"be5d1e3fa7693f154bb5cce1fe0b1971babe4f088c8cd497cd7163100b9d0a60"} Mar 17 11:59:21 crc kubenswrapper[4742]: I0317 11:59:21.299310 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l5gm8"] Mar 17 11:59:21 crc kubenswrapper[4742]: I0317 11:59:21.302015 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l5gm8" Mar 17 11:59:21 crc kubenswrapper[4742]: I0317 11:59:21.311343 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l5gm8"] Mar 17 11:59:21 crc kubenswrapper[4742]: I0317 11:59:21.407836 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68213073-9623-47e5-88fe-bf017f130a6f-utilities\") pod \"redhat-operators-l5gm8\" (UID: \"68213073-9623-47e5-88fe-bf017f130a6f\") " pod="openshift-marketplace/redhat-operators-l5gm8" Mar 17 11:59:21 crc kubenswrapper[4742]: I0317 11:59:21.407927 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x8ms\" (UniqueName: \"kubernetes.io/projected/68213073-9623-47e5-88fe-bf017f130a6f-kube-api-access-9x8ms\") pod \"redhat-operators-l5gm8\" (UID: \"68213073-9623-47e5-88fe-bf017f130a6f\") " pod="openshift-marketplace/redhat-operators-l5gm8" Mar 17 11:59:21 crc kubenswrapper[4742]: I0317 11:59:21.408115 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68213073-9623-47e5-88fe-bf017f130a6f-catalog-content\") pod \"redhat-operators-l5gm8\" (UID: \"68213073-9623-47e5-88fe-bf017f130a6f\") " pod="openshift-marketplace/redhat-operators-l5gm8" Mar 17 11:59:21 crc kubenswrapper[4742]: I0317 11:59:21.509841 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68213073-9623-47e5-88fe-bf017f130a6f-utilities\") pod \"redhat-operators-l5gm8\" (UID: \"68213073-9623-47e5-88fe-bf017f130a6f\") " pod="openshift-marketplace/redhat-operators-l5gm8" Mar 17 11:59:21 crc kubenswrapper[4742]: I0317 11:59:21.510267 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x8ms\" (UniqueName: \"kubernetes.io/projected/68213073-9623-47e5-88fe-bf017f130a6f-kube-api-access-9x8ms\") pod \"redhat-operators-l5gm8\" (UID: \"68213073-9623-47e5-88fe-bf017f130a6f\") " pod="openshift-marketplace/redhat-operators-l5gm8" Mar 17 11:59:21 crc kubenswrapper[4742]: I0317 11:59:21.510331 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68213073-9623-47e5-88fe-bf017f130a6f-catalog-content\") pod \"redhat-operators-l5gm8\" (UID: \"68213073-9623-47e5-88fe-bf017f130a6f\") " pod="openshift-marketplace/redhat-operators-l5gm8" Mar 17 11:59:21 crc kubenswrapper[4742]: I0317 11:59:21.510543 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68213073-9623-47e5-88fe-bf017f130a6f-utilities\") pod \"redhat-operators-l5gm8\" (UID: \"68213073-9623-47e5-88fe-bf017f130a6f\") " pod="openshift-marketplace/redhat-operators-l5gm8" Mar 17 11:59:21 crc kubenswrapper[4742]: I0317 11:59:21.510833 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68213073-9623-47e5-88fe-bf017f130a6f-catalog-content\") pod \"redhat-operators-l5gm8\" (UID: \"68213073-9623-47e5-88fe-bf017f130a6f\") " pod="openshift-marketplace/redhat-operators-l5gm8" Mar 17 11:59:21 crc kubenswrapper[4742]: I0317 11:59:21.534482 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9x8ms\" (UniqueName: \"kubernetes.io/projected/68213073-9623-47e5-88fe-bf017f130a6f-kube-api-access-9x8ms\") pod \"redhat-operators-l5gm8\" (UID: \"68213073-9623-47e5-88fe-bf017f130a6f\") " pod="openshift-marketplace/redhat-operators-l5gm8" Mar 17 11:59:21 crc kubenswrapper[4742]: I0317 11:59:21.637466 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l5gm8" Mar 17 11:59:23 crc kubenswrapper[4742]: I0317 11:59:23.293943 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-frbfk"] Mar 17 11:59:23 crc kubenswrapper[4742]: I0317 11:59:23.296518 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frbfk" Mar 17 11:59:23 crc kubenswrapper[4742]: I0317 11:59:23.326645 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-frbfk"] Mar 17 11:59:23 crc kubenswrapper[4742]: I0317 11:59:23.467603 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2344603c-7cde-40db-b48c-284575cb80cc-utilities\") pod \"certified-operators-frbfk\" (UID: \"2344603c-7cde-40db-b48c-284575cb80cc\") " pod="openshift-marketplace/certified-operators-frbfk" Mar 17 11:59:23 crc kubenswrapper[4742]: I0317 11:59:23.467689 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2344603c-7cde-40db-b48c-284575cb80cc-catalog-content\") pod \"certified-operators-frbfk\" (UID: \"2344603c-7cde-40db-b48c-284575cb80cc\") " pod="openshift-marketplace/certified-operators-frbfk" Mar 17 11:59:23 crc kubenswrapper[4742]: I0317 11:59:23.467740 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpk7c\" (UniqueName: \"kubernetes.io/projected/2344603c-7cde-40db-b48c-284575cb80cc-kube-api-access-xpk7c\") pod \"certified-operators-frbfk\" (UID: \"2344603c-7cde-40db-b48c-284575cb80cc\") " pod="openshift-marketplace/certified-operators-frbfk" Mar 17 11:59:23 crc kubenswrapper[4742]: I0317 11:59:23.575893 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2344603c-7cde-40db-b48c-284575cb80cc-utilities\") pod \"certified-operators-frbfk\" (UID: \"2344603c-7cde-40db-b48c-284575cb80cc\") " pod="openshift-marketplace/certified-operators-frbfk" Mar 17 11:59:23 crc kubenswrapper[4742]: I0317 11:59:23.575994 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2344603c-7cde-40db-b48c-284575cb80cc-catalog-content\") pod \"certified-operators-frbfk\" (UID: \"2344603c-7cde-40db-b48c-284575cb80cc\") " pod="openshift-marketplace/certified-operators-frbfk" Mar 17 11:59:23 crc kubenswrapper[4742]: I0317 11:59:23.576048 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpk7c\" (UniqueName: \"kubernetes.io/projected/2344603c-7cde-40db-b48c-284575cb80cc-kube-api-access-xpk7c\") pod \"certified-operators-frbfk\" (UID: \"2344603c-7cde-40db-b48c-284575cb80cc\") " pod="openshift-marketplace/certified-operators-frbfk" Mar 17 11:59:23 crc kubenswrapper[4742]: I0317 11:59:23.576468 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2344603c-7cde-40db-b48c-284575cb80cc-utilities\") pod \"certified-operators-frbfk\" (UID: \"2344603c-7cde-40db-b48c-284575cb80cc\") " pod="openshift-marketplace/certified-operators-frbfk" Mar 17 11:59:23 crc kubenswrapper[4742]: I0317 11:59:23.576714 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2344603c-7cde-40db-b48c-284575cb80cc-catalog-content\") pod \"certified-operators-frbfk\" (UID: \"2344603c-7cde-40db-b48c-284575cb80cc\") " pod="openshift-marketplace/certified-operators-frbfk" Mar 17 11:59:23 crc kubenswrapper[4742]: I0317 11:59:23.594629 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpk7c\" (UniqueName: \"kubernetes.io/projected/2344603c-7cde-40db-b48c-284575cb80cc-kube-api-access-xpk7c\") pod \"certified-operators-frbfk\" (UID: \"2344603c-7cde-40db-b48c-284575cb80cc\") " pod="openshift-marketplace/certified-operators-frbfk" Mar 17 11:59:23 crc kubenswrapper[4742]: I0317 11:59:23.624583 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frbfk" Mar 17 11:59:34 crc kubenswrapper[4742]: E0317 11:59:34.635902 4742 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 17 11:59:34 crc kubenswrapper[4742]: E0317 11:59:34.636437 4742 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l9npq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:
nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(cbe323de-3d55-4905-8f28-29cea959ae35): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 17 11:59:34 crc kubenswrapper[4742]: E0317 11:59:34.637750 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="cbe323de-3d55-4905-8f28-29cea959ae35" Mar 17 11:59:35 crc kubenswrapper[4742]: I0317 11:59:35.077230 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l5gm8"] Mar 17 11:59:35 crc kubenswrapper[4742]: I0317 11:59:35.087695 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-frbfk"] Mar 17 11:59:35 crc kubenswrapper[4742]: I0317 11:59:35.112166 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5gm8" event={"ID":"68213073-9623-47e5-88fe-bf017f130a6f","Type":"ContainerStarted","Data":"f69926bccdf3850a4d5fa9eb0208ea9b4e8dc9ee6aafe77c069be3f22b8637ca"} Mar 17 11:59:35 crc kubenswrapper[4742]: I0317 11:59:35.114355 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frbfk" event={"ID":"2344603c-7cde-40db-b48c-284575cb80cc","Type":"ContainerStarted","Data":"fc3c0b1d1b27d48fad588512002d71296bab20a61dc8b0236dcbf394b78e7368"} Mar 17 11:59:35 crc kubenswrapper[4742]: E0317 11:59:35.117169 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="cbe323de-3d55-4905-8f28-29cea959ae35" Mar 17 11:59:36 crc kubenswrapper[4742]: I0317 11:59:36.130500 4742 generic.go:334] "Generic (PLEG): container finished" podID="68213073-9623-47e5-88fe-bf017f130a6f" containerID="240a2c2b3635b122c37f938e58960b1b970126d847bdb6cbb19a4c657eb360cb" exitCode=0 Mar 17 11:59:36 crc kubenswrapper[4742]: I0317 11:59:36.131050 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5gm8" 
event={"ID":"68213073-9623-47e5-88fe-bf017f130a6f","Type":"ContainerDied","Data":"240a2c2b3635b122c37f938e58960b1b970126d847bdb6cbb19a4c657eb360cb"} Mar 17 11:59:36 crc kubenswrapper[4742]: I0317 11:59:36.138408 4742 generic.go:334] "Generic (PLEG): container finished" podID="2344603c-7cde-40db-b48c-284575cb80cc" containerID="7e29812106a8e04a7b587412fc726e76df34411e6a2c4219738a253ff48f1d67" exitCode=0 Mar 17 11:59:36 crc kubenswrapper[4742]: I0317 11:59:36.138786 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frbfk" event={"ID":"2344603c-7cde-40db-b48c-284575cb80cc","Type":"ContainerDied","Data":"7e29812106a8e04a7b587412fc726e76df34411e6a2c4219738a253ff48f1d67"} Mar 17 11:59:38 crc kubenswrapper[4742]: I0317 11:59:38.160929 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5gm8" event={"ID":"68213073-9623-47e5-88fe-bf017f130a6f","Type":"ContainerStarted","Data":"a272e7d3f75ad2166a0854da44244817dd5b1f16477e8fdf2aeef324a8c916ed"} Mar 17 11:59:38 crc kubenswrapper[4742]: I0317 11:59:38.163800 4742 generic.go:334] "Generic (PLEG): container finished" podID="2344603c-7cde-40db-b48c-284575cb80cc" containerID="2b61a55ec7bc9d5e00f9a8c05fe229da0ebbbff9514e65fd2d33766acbfca124" exitCode=0 Mar 17 11:59:38 crc kubenswrapper[4742]: I0317 11:59:38.163854 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frbfk" event={"ID":"2344603c-7cde-40db-b48c-284575cb80cc","Type":"ContainerDied","Data":"2b61a55ec7bc9d5e00f9a8c05fe229da0ebbbff9514e65fd2d33766acbfca124"} Mar 17 11:59:40 crc kubenswrapper[4742]: I0317 11:59:40.182309 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frbfk" event={"ID":"2344603c-7cde-40db-b48c-284575cb80cc","Type":"ContainerStarted","Data":"ae6bd6a2e38e5b9421f20c870ab4eb380a13b6ee1e356014c824d19e39ed0772"} Mar 17 11:59:40 crc kubenswrapper[4742]: I0317 11:59:40.207306 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-frbfk" podStartSLOduration=14.004573993 podStartE2EDuration="17.207290636s" podCreationTimestamp="2026-03-17 11:59:23 +0000 UTC" firstStartedPulling="2026-03-17 11:59:36.142331188 +0000 UTC m=+2879.268458986" lastFinishedPulling="2026-03-17 11:59:39.345047831 +0000 UTC m=+2882.471175629" observedRunningTime="2026-03-17 11:59:40.204095428 +0000 UTC m=+2883.330223176" watchObservedRunningTime="2026-03-17 11:59:40.207290636 +0000 UTC m=+2883.333418394" Mar 17 11:59:41 crc kubenswrapper[4742]: I0317 11:59:41.212540 4742 generic.go:334] "Generic (PLEG): container finished" podID="68213073-9623-47e5-88fe-bf017f130a6f" containerID="a272e7d3f75ad2166a0854da44244817dd5b1f16477e8fdf2aeef324a8c916ed" exitCode=0 Mar 17 11:59:41 crc kubenswrapper[4742]: I0317 11:59:41.213323 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5gm8" event={"ID":"68213073-9623-47e5-88fe-bf017f130a6f","Type":"ContainerDied","Data":"a272e7d3f75ad2166a0854da44244817dd5b1f16477e8fdf2aeef324a8c916ed"} Mar 17 11:59:42 crc kubenswrapper[4742]: I0317 11:59:42.225368 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5gm8" event={"ID":"68213073-9623-47e5-88fe-bf017f130a6f","Type":"ContainerStarted","Data":"7d0a1f093580f4597bf1bd0c7f5dd9afe190474728bb31d497773ea53e641906"} Mar 17 11:59:42 crc kubenswrapper[4742]: I0317 11:59:42.250502 4742 
Mar 17 11:59:43 crc kubenswrapper[4742]: I0317 11:59:43.625515 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-frbfk"
Mar 17 11:59:43 crc kubenswrapper[4742]: I0317 11:59:43.625961 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-frbfk"
Mar 17 11:59:43 crc kubenswrapper[4742]: I0317 11:59:43.679881 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-frbfk"
Mar 17 11:59:44 crc kubenswrapper[4742]: I0317 11:59:44.344953 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-frbfk"
Mar 17 11:59:44 crc kubenswrapper[4742]: I0317 11:59:44.410162 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-frbfk"]
Mar 17 11:59:46 crc kubenswrapper[4742]: I0317 11:59:46.281457 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-frbfk" podUID="2344603c-7cde-40db-b48c-284575cb80cc" containerName="registry-server" containerID="cri-o://ae6bd6a2e38e5b9421f20c870ab4eb380a13b6ee1e356014c824d19e39ed0772" gracePeriod=2
Mar 17 11:59:46 crc kubenswrapper[4742]: I0317 11:59:46.784158 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frbfk"
Mar 17 11:59:46 crc kubenswrapper[4742]: I0317 11:59:46.896096 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2344603c-7cde-40db-b48c-284575cb80cc-catalog-content\") pod \"2344603c-7cde-40db-b48c-284575cb80cc\" (UID: \"2344603c-7cde-40db-b48c-284575cb80cc\") "
Mar 17 11:59:46 crc kubenswrapper[4742]: I0317 11:59:46.920290 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpk7c\" (UniqueName: \"kubernetes.io/projected/2344603c-7cde-40db-b48c-284575cb80cc-kube-api-access-xpk7c\") pod \"2344603c-7cde-40db-b48c-284575cb80cc\" (UID: \"2344603c-7cde-40db-b48c-284575cb80cc\") "
Mar 17 11:59:46 crc kubenswrapper[4742]: I0317 11:59:46.920370 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2344603c-7cde-40db-b48c-284575cb80cc-utilities\") pod \"2344603c-7cde-40db-b48c-284575cb80cc\" (UID: \"2344603c-7cde-40db-b48c-284575cb80cc\") "
Mar 17 11:59:46 crc kubenswrapper[4742]: I0317 11:59:46.921279 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2344603c-7cde-40db-b48c-284575cb80cc-utilities" (OuterVolumeSpecName: "utilities") pod "2344603c-7cde-40db-b48c-284575cb80cc" (UID: "2344603c-7cde-40db-b48c-284575cb80cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 11:59:46 crc kubenswrapper[4742]: I0317 11:59:46.933078 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2344603c-7cde-40db-b48c-284575cb80cc-kube-api-access-xpk7c" (OuterVolumeSpecName: "kube-api-access-xpk7c") pod "2344603c-7cde-40db-b48c-284575cb80cc" (UID: "2344603c-7cde-40db-b48c-284575cb80cc"). InnerVolumeSpecName "kube-api-access-xpk7c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:59:46 crc kubenswrapper[4742]: I0317 11:59:46.944941 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2344603c-7cde-40db-b48c-284575cb80cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2344603c-7cde-40db-b48c-284575cb80cc" (UID: "2344603c-7cde-40db-b48c-284575cb80cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.023389 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2344603c-7cde-40db-b48c-284575cb80cc-utilities\") on node \"crc\" DevicePath \"\""
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.023458 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2344603c-7cde-40db-b48c-284575cb80cc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.023488 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpk7c\" (UniqueName: \"kubernetes.io/projected/2344603c-7cde-40db-b48c-284575cb80cc-kube-api-access-xpk7c\") on node \"crc\" DevicePath \"\""
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.135189 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.302857 4742 generic.go:334] "Generic (PLEG): container finished" podID="2344603c-7cde-40db-b48c-284575cb80cc" containerID="ae6bd6a2e38e5b9421f20c870ab4eb380a13b6ee1e356014c824d19e39ed0772" exitCode=0
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.302954 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frbfk"
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.302946 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frbfk" event={"ID":"2344603c-7cde-40db-b48c-284575cb80cc","Type":"ContainerDied","Data":"ae6bd6a2e38e5b9421f20c870ab4eb380a13b6ee1e356014c824d19e39ed0772"}
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.303487 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frbfk" event={"ID":"2344603c-7cde-40db-b48c-284575cb80cc","Type":"ContainerDied","Data":"fc3c0b1d1b27d48fad588512002d71296bab20a61dc8b0236dcbf394b78e7368"}
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.303507 4742 scope.go:117] "RemoveContainer" containerID="ae6bd6a2e38e5b9421f20c870ab4eb380a13b6ee1e356014c824d19e39ed0772"
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.342920 4742 scope.go:117] "RemoveContainer" containerID="2b61a55ec7bc9d5e00f9a8c05fe229da0ebbbff9514e65fd2d33766acbfca124"
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.361118 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-frbfk"]
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.371307 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-frbfk"]
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.409118 4742 scope.go:117] "RemoveContainer" containerID="7e29812106a8e04a7b587412fc726e76df34411e6a2c4219738a253ff48f1d67"
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.494877 4742 scope.go:117] "RemoveContainer" containerID="ae6bd6a2e38e5b9421f20c870ab4eb380a13b6ee1e356014c824d19e39ed0772"
Mar 17 11:59:47 crc kubenswrapper[4742]: E0317 11:59:47.495404 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae6bd6a2e38e5b9421f20c870ab4eb380a13b6ee1e356014c824d19e39ed0772\": container with ID starting with ae6bd6a2e38e5b9421f20c870ab4eb380a13b6ee1e356014c824d19e39ed0772 not found: ID does not exist" containerID="ae6bd6a2e38e5b9421f20c870ab4eb380a13b6ee1e356014c824d19e39ed0772"
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.495436 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae6bd6a2e38e5b9421f20c870ab4eb380a13b6ee1e356014c824d19e39ed0772"} err="failed to get container status \"ae6bd6a2e38e5b9421f20c870ab4eb380a13b6ee1e356014c824d19e39ed0772\": rpc error: code = NotFound desc = could not find container \"ae6bd6a2e38e5b9421f20c870ab4eb380a13b6ee1e356014c824d19e39ed0772\": container with ID starting with ae6bd6a2e38e5b9421f20c870ab4eb380a13b6ee1e356014c824d19e39ed0772 not found: ID does not exist"
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.495455 4742 scope.go:117] "RemoveContainer" containerID="2b61a55ec7bc9d5e00f9a8c05fe229da0ebbbff9514e65fd2d33766acbfca124"
Mar 17 11:59:47 crc kubenswrapper[4742]: E0317 11:59:47.497303 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b61a55ec7bc9d5e00f9a8c05fe229da0ebbbff9514e65fd2d33766acbfca124\": container with ID starting with 2b61a55ec7bc9d5e00f9a8c05fe229da0ebbbff9514e65fd2d33766acbfca124 not found: ID does not exist" containerID="2b61a55ec7bc9d5e00f9a8c05fe229da0ebbbff9514e65fd2d33766acbfca124"
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.497337 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b61a55ec7bc9d5e00f9a8c05fe229da0ebbbff9514e65fd2d33766acbfca124"} err="failed to get container status \"2b61a55ec7bc9d5e00f9a8c05fe229da0ebbbff9514e65fd2d33766acbfca124\": rpc error: code = NotFound desc = could not find container \"2b61a55ec7bc9d5e00f9a8c05fe229da0ebbbff9514e65fd2d33766acbfca124\": container with ID starting with 2b61a55ec7bc9d5e00f9a8c05fe229da0ebbbff9514e65fd2d33766acbfca124 not found: ID does not exist"
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.497354 4742 scope.go:117] "RemoveContainer" containerID="7e29812106a8e04a7b587412fc726e76df34411e6a2c4219738a253ff48f1d67"
Mar 17 11:59:47 crc kubenswrapper[4742]: E0317 11:59:47.498137 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e29812106a8e04a7b587412fc726e76df34411e6a2c4219738a253ff48f1d67\": container with ID starting with 7e29812106a8e04a7b587412fc726e76df34411e6a2c4219738a253ff48f1d67 not found: ID does not exist" containerID="7e29812106a8e04a7b587412fc726e76df34411e6a2c4219738a253ff48f1d67"
Mar 17 11:59:47 crc kubenswrapper[4742]: I0317 11:59:47.498202 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e29812106a8e04a7b587412fc726e76df34411e6a2c4219738a253ff48f1d67"} err="failed to get container status \"7e29812106a8e04a7b587412fc726e76df34411e6a2c4219738a253ff48f1d67\": rpc error: code = NotFound desc = could not find container \"7e29812106a8e04a7b587412fc726e76df34411e6a2c4219738a253ff48f1d67\": container with ID starting with 7e29812106a8e04a7b587412fc726e76df34411e6a2c4219738a253ff48f1d67 not found: ID does not exist"
Mar 17 11:59:48 crc kubenswrapper[4742]: I0317 11:59:48.043761 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 11:59:48 crc kubenswrapper[4742]: I0317 11:59:48.044190 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 11:59:48 crc kubenswrapper[4742]: I0317 11:59:48.677209 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2344603c-7cde-40db-b48c-284575cb80cc" path="/var/lib/kubelet/pods/2344603c-7cde-40db-b48c-284575cb80cc/volumes"
Mar 17 11:59:49 crc kubenswrapper[4742]: I0317 11:59:49.333554 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"cbe323de-3d55-4905-8f28-29cea959ae35","Type":"ContainerStarted","Data":"4b4bd5b2fc127bd20641300159a13b0689286cc862afe3593d73593d72e88aa3"}
Mar 17 11:59:49 crc kubenswrapper[4742]: I0317 11:59:49.370956 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.479513476 podStartE2EDuration="47.370938008s" podCreationTimestamp="2026-03-17 11:59:02 +0000 UTC" firstStartedPulling="2026-03-17 11:59:04.240376255 +0000 UTC m=+2847.366504013" lastFinishedPulling="2026-03-17 11:59:47.131800777 +0000 UTC m=+2890.257928545" observedRunningTime="2026-03-17 11:59:49.366114425 +0000 UTC m=+2892.492242193" watchObservedRunningTime="2026-03-17 11:59:49.370938008 +0000 UTC m=+2892.497065776"
Mar 17 11:59:51 crc kubenswrapper[4742]: I0317 11:59:51.637720 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l5gm8"
Mar 17 11:59:51 crc kubenswrapper[4742]: I0317 11:59:51.638201 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l5gm8"
Mar 17 11:59:51 crc kubenswrapper[4742]: I0317 11:59:51.698081 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l5gm8"
Mar 17 11:59:52 crc kubenswrapper[4742]: I0317 11:59:52.422253 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l5gm8"
Mar 17 11:59:52 crc kubenswrapper[4742]: I0317 11:59:52.502565 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l5gm8"]
Mar 17 11:59:54 crc kubenswrapper[4742]: I0317 11:59:54.381889 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l5gm8" podUID="68213073-9623-47e5-88fe-bf017f130a6f" containerName="registry-server" containerID="cri-o://7d0a1f093580f4597bf1bd0c7f5dd9afe190474728bb31d497773ea53e641906" gracePeriod=2
Mar 17 11:59:54 crc kubenswrapper[4742]: I0317 11:59:54.889690 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l5gm8"
Mar 17 11:59:54 crc kubenswrapper[4742]: I0317 11:59:54.973015 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68213073-9623-47e5-88fe-bf017f130a6f-catalog-content\") pod \"68213073-9623-47e5-88fe-bf017f130a6f\" (UID: \"68213073-9623-47e5-88fe-bf017f130a6f\") "
Mar 17 11:59:54 crc kubenswrapper[4742]: I0317 11:59:54.973357 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x8ms\" (UniqueName: \"kubernetes.io/projected/68213073-9623-47e5-88fe-bf017f130a6f-kube-api-access-9x8ms\") pod \"68213073-9623-47e5-88fe-bf017f130a6f\" (UID: \"68213073-9623-47e5-88fe-bf017f130a6f\") "
Mar 17 11:59:54 crc kubenswrapper[4742]: I0317 11:59:54.973400 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68213073-9623-47e5-88fe-bf017f130a6f-utilities\") pod \"68213073-9623-47e5-88fe-bf017f130a6f\" (UID: \"68213073-9623-47e5-88fe-bf017f130a6f\") "
Mar 17 11:59:54 crc kubenswrapper[4742]: I0317 11:59:54.974194 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68213073-9623-47e5-88fe-bf017f130a6f-utilities" (OuterVolumeSpecName: "utilities") pod "68213073-9623-47e5-88fe-bf017f130a6f" (UID: "68213073-9623-47e5-88fe-bf017f130a6f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 11:59:54 crc kubenswrapper[4742]: I0317 11:59:54.980560 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68213073-9623-47e5-88fe-bf017f130a6f-kube-api-access-9x8ms" (OuterVolumeSpecName: "kube-api-access-9x8ms") pod "68213073-9623-47e5-88fe-bf017f130a6f" (UID: "68213073-9623-47e5-88fe-bf017f130a6f"). InnerVolumeSpecName "kube-api-access-9x8ms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.075883 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x8ms\" (UniqueName: \"kubernetes.io/projected/68213073-9623-47e5-88fe-bf017f130a6f-kube-api-access-9x8ms\") on node \"crc\" DevicePath \"\""
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.075928 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68213073-9623-47e5-88fe-bf017f130a6f-utilities\") on node \"crc\" DevicePath \"\""
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.117000 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68213073-9623-47e5-88fe-bf017f130a6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68213073-9623-47e5-88fe-bf017f130a6f" (UID: "68213073-9623-47e5-88fe-bf017f130a6f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.177429 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68213073-9623-47e5-88fe-bf017f130a6f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.397339 4742 generic.go:334] "Generic (PLEG): container finished" podID="68213073-9623-47e5-88fe-bf017f130a6f" containerID="7d0a1f093580f4597bf1bd0c7f5dd9afe190474728bb31d497773ea53e641906" exitCode=0
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.397388 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5gm8" event={"ID":"68213073-9623-47e5-88fe-bf017f130a6f","Type":"ContainerDied","Data":"7d0a1f093580f4597bf1bd0c7f5dd9afe190474728bb31d497773ea53e641906"}
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.397418 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5gm8" event={"ID":"68213073-9623-47e5-88fe-bf017f130a6f","Type":"ContainerDied","Data":"f69926bccdf3850a4d5fa9eb0208ea9b4e8dc9ee6aafe77c069be3f22b8637ca"}
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.397436 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l5gm8"
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.397437 4742 scope.go:117] "RemoveContainer" containerID="7d0a1f093580f4597bf1bd0c7f5dd9afe190474728bb31d497773ea53e641906"
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.436240 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l5gm8"]
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.443658 4742 scope.go:117] "RemoveContainer" containerID="a272e7d3f75ad2166a0854da44244817dd5b1f16477e8fdf2aeef324a8c916ed"
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.449829 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l5gm8"]
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.484568 4742 scope.go:117] "RemoveContainer" containerID="240a2c2b3635b122c37f938e58960b1b970126d847bdb6cbb19a4c657eb360cb"
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.532849 4742 scope.go:117] "RemoveContainer" containerID="7d0a1f093580f4597bf1bd0c7f5dd9afe190474728bb31d497773ea53e641906"
Mar 17 11:59:55 crc kubenswrapper[4742]: E0317 11:59:55.533378 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d0a1f093580f4597bf1bd0c7f5dd9afe190474728bb31d497773ea53e641906\": container with ID starting with 7d0a1f093580f4597bf1bd0c7f5dd9afe190474728bb31d497773ea53e641906 not found: ID does not exist" containerID="7d0a1f093580f4597bf1bd0c7f5dd9afe190474728bb31d497773ea53e641906"
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.533416 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d0a1f093580f4597bf1bd0c7f5dd9afe190474728bb31d497773ea53e641906"} err="failed to get container status \"7d0a1f093580f4597bf1bd0c7f5dd9afe190474728bb31d497773ea53e641906\": rpc error: code = NotFound desc = could not find container \"7d0a1f093580f4597bf1bd0c7f5dd9afe190474728bb31d497773ea53e641906\": container with ID starting with 7d0a1f093580f4597bf1bd0c7f5dd9afe190474728bb31d497773ea53e641906 not found: ID does not exist"
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.533442 4742 scope.go:117] "RemoveContainer" containerID="a272e7d3f75ad2166a0854da44244817dd5b1f16477e8fdf2aeef324a8c916ed"
Mar 17 11:59:55 crc kubenswrapper[4742]: E0317 11:59:55.533895 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a272e7d3f75ad2166a0854da44244817dd5b1f16477e8fdf2aeef324a8c916ed\": container with ID starting with a272e7d3f75ad2166a0854da44244817dd5b1f16477e8fdf2aeef324a8c916ed not found: ID does not exist" containerID="a272e7d3f75ad2166a0854da44244817dd5b1f16477e8fdf2aeef324a8c916ed"
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.533981 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a272e7d3f75ad2166a0854da44244817dd5b1f16477e8fdf2aeef324a8c916ed"} err="failed to get container status \"a272e7d3f75ad2166a0854da44244817dd5b1f16477e8fdf2aeef324a8c916ed\": rpc error: code = NotFound desc = could not find container \"a272e7d3f75ad2166a0854da44244817dd5b1f16477e8fdf2aeef324a8c916ed\": container with ID starting with a272e7d3f75ad2166a0854da44244817dd5b1f16477e8fdf2aeef324a8c916ed not found: ID does not exist"
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.534021 4742 scope.go:117] "RemoveContainer" containerID="240a2c2b3635b122c37f938e58960b1b970126d847bdb6cbb19a4c657eb360cb"
Mar 17 11:59:55 crc kubenswrapper[4742]: E0317 11:59:55.534354 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"240a2c2b3635b122c37f938e58960b1b970126d847bdb6cbb19a4c657eb360cb\": container with ID starting with 240a2c2b3635b122c37f938e58960b1b970126d847bdb6cbb19a4c657eb360cb not found: ID does not exist" containerID="240a2c2b3635b122c37f938e58960b1b970126d847bdb6cbb19a4c657eb360cb"
Mar 17 11:59:55 crc kubenswrapper[4742]: I0317 11:59:55.534387 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"240a2c2b3635b122c37f938e58960b1b970126d847bdb6cbb19a4c657eb360cb"} err="failed to get container status \"240a2c2b3635b122c37f938e58960b1b970126d847bdb6cbb19a4c657eb360cb\": rpc error: code = NotFound desc = could not find container \"240a2c2b3635b122c37f938e58960b1b970126d847bdb6cbb19a4c657eb360cb\": container with ID starting with 240a2c2b3635b122c37f938e58960b1b970126d847bdb6cbb19a4c657eb360cb not found: ID does not exist"
Mar 17 11:59:56 crc kubenswrapper[4742]: I0317 11:59:56.676248 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68213073-9623-47e5-88fe-bf017f130a6f" path="/var/lib/kubelet/pods/68213073-9623-47e5-88fe-bf017f130a6f/volumes"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.150898 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562480-8l4gb"]
Mar 17 12:00:00 crc kubenswrapper[4742]: E0317 12:00:00.151711 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2344603c-7cde-40db-b48c-284575cb80cc" containerName="registry-server"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.151723 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="2344603c-7cde-40db-b48c-284575cb80cc" containerName="registry-server"
Mar 17 12:00:00 crc kubenswrapper[4742]: E0317 12:00:00.151739 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68213073-9623-47e5-88fe-bf017f130a6f" containerName="extract-content"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.151745 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="68213073-9623-47e5-88fe-bf017f130a6f" containerName="extract-content"
Mar 17 12:00:00 crc kubenswrapper[4742]: E0317 12:00:00.151754 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2344603c-7cde-40db-b48c-284575cb80cc" containerName="extract-utilities"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.151760 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="2344603c-7cde-40db-b48c-284575cb80cc" containerName="extract-utilities"
Mar 17 12:00:00 crc kubenswrapper[4742]: E0317 12:00:00.151778 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68213073-9623-47e5-88fe-bf017f130a6f" containerName="extract-utilities"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.151784 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="68213073-9623-47e5-88fe-bf017f130a6f" containerName="extract-utilities"
Mar 17 12:00:00 crc kubenswrapper[4742]: E0317 12:00:00.151793 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2344603c-7cde-40db-b48c-284575cb80cc" containerName="extract-content"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.151802 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="2344603c-7cde-40db-b48c-284575cb80cc" containerName="extract-content"
Mar 17 12:00:00 crc kubenswrapper[4742]: E0317 12:00:00.151813 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68213073-9623-47e5-88fe-bf017f130a6f" containerName="registry-server"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.151818 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="68213073-9623-47e5-88fe-bf017f130a6f" containerName="registry-server"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.151986 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="68213073-9623-47e5-88fe-bf017f130a6f" containerName="registry-server"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.152024 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="2344603c-7cde-40db-b48c-284575cb80cc" containerName="registry-server"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.152573 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562480-8l4gb"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.154545 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.155022 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.155193 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.163787 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562480-8l4gb"]
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.193304 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc85b\" (UniqueName: \"kubernetes.io/projected/ff23f76e-4787-4741-a13b-ada68fec94b9-kube-api-access-dc85b\") pod \"auto-csr-approver-29562480-8l4gb\" (UID: \"ff23f76e-4787-4741-a13b-ada68fec94b9\") " pod="openshift-infra/auto-csr-approver-29562480-8l4gb"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.248922 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp"]
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.250174 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.252475 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.253083 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.275073 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp"]
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.294331 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc85b\" (UniqueName: \"kubernetes.io/projected/ff23f76e-4787-4741-a13b-ada68fec94b9-kube-api-access-dc85b\") pod \"auto-csr-approver-29562480-8l4gb\" (UID: \"ff23f76e-4787-4741-a13b-ada68fec94b9\") " pod="openshift-infra/auto-csr-approver-29562480-8l4gb"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.294396 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/994765fd-f705-4e41-ab78-cd6e937c1627-config-volume\") pod \"collect-profiles-29562480-s6drp\" (UID: \"994765fd-f705-4e41-ab78-cd6e937c1627\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.294534 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59xgv\" (UniqueName: \"kubernetes.io/projected/994765fd-f705-4e41-ab78-cd6e937c1627-kube-api-access-59xgv\") pod \"collect-profiles-29562480-s6drp\" (UID: \"994765fd-f705-4e41-ab78-cd6e937c1627\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.294592 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/994765fd-f705-4e41-ab78-cd6e937c1627-secret-volume\") pod \"collect-profiles-29562480-s6drp\" (UID: \"994765fd-f705-4e41-ab78-cd6e937c1627\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.318784 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc85b\" (UniqueName: \"kubernetes.io/projected/ff23f76e-4787-4741-a13b-ada68fec94b9-kube-api-access-dc85b\") pod \"auto-csr-approver-29562480-8l4gb\" (UID: \"ff23f76e-4787-4741-a13b-ada68fec94b9\") " pod="openshift-infra/auto-csr-approver-29562480-8l4gb"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.397522 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59xgv\" (UniqueName: \"kubernetes.io/projected/994765fd-f705-4e41-ab78-cd6e937c1627-kube-api-access-59xgv\") pod \"collect-profiles-29562480-s6drp\" (UID: \"994765fd-f705-4e41-ab78-cd6e937c1627\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.397626 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/994765fd-f705-4e41-ab78-cd6e937c1627-secret-volume\") pod \"collect-profiles-29562480-s6drp\" (UID: \"994765fd-f705-4e41-ab78-cd6e937c1627\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.398385 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/994765fd-f705-4e41-ab78-cd6e937c1627-config-volume\") pod \"collect-profiles-29562480-s6drp\" (UID: \"994765fd-f705-4e41-ab78-cd6e937c1627\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.399395 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/994765fd-f705-4e41-ab78-cd6e937c1627-config-volume\") pod \"collect-profiles-29562480-s6drp\" (UID: \"994765fd-f705-4e41-ab78-cd6e937c1627\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.406325 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/994765fd-f705-4e41-ab78-cd6e937c1627-secret-volume\") pod \"collect-profiles-29562480-s6drp\" (UID: \"994765fd-f705-4e41-ab78-cd6e937c1627\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.413268 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59xgv\" (UniqueName: \"kubernetes.io/projected/994765fd-f705-4e41-ab78-cd6e937c1627-kube-api-access-59xgv\") pod \"collect-profiles-29562480-s6drp\" (UID: \"994765fd-f705-4e41-ab78-cd6e937c1627\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.470494 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562480-8l4gb"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.570208 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp"
Mar 17 12:00:00 crc kubenswrapper[4742]: I0317 12:00:00.962336 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562480-8l4gb"]
Mar 17 12:00:00 crc kubenswrapper[4742]: W0317 12:00:00.963455 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff23f76e_4787_4741_a13b_ada68fec94b9.slice/crio-38dc9c96e86ea0d265495b82377c4654a531c961115d70afe3fef92d0989c0d1 WatchSource:0}: Error finding container 38dc9c96e86ea0d265495b82377c4654a531c961115d70afe3fef92d0989c0d1: Status 404 returned error can't find the container with id 38dc9c96e86ea0d265495b82377c4654a531c961115d70afe3fef92d0989c0d1
Mar 17 12:00:01 crc kubenswrapper[4742]: I0317 12:00:01.096347 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp"]
Mar 17 12:00:01 crc kubenswrapper[4742]: W0317 12:00:01.096781 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod994765fd_f705_4e41_ab78_cd6e937c1627.slice/crio-f6b0ae60bc5bd7b5cdd13a655b7d0505f785b2bc72c62f7bd6cf6fb7bb833912 WatchSource:0}: Error finding container f6b0ae60bc5bd7b5cdd13a655b7d0505f785b2bc72c62f7bd6cf6fb7bb833912: Status 404 returned error can't find the container with id f6b0ae60bc5bd7b5cdd13a655b7d0505f785b2bc72c62f7bd6cf6fb7bb833912
Mar 17 12:00:01 crc kubenswrapper[4742]: I0317 12:00:01.450689 4742 generic.go:334] "Generic (PLEG): container finished" podID="994765fd-f705-4e41-ab78-cd6e937c1627" containerID="b428ae4d3fc8401f10299a861bae667f51c60c96124f03c79359209e8491c0bb" exitCode=0
Mar 17 12:00:01 crc kubenswrapper[4742]: I0317 12:00:01.450826 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp" event={"ID":"994765fd-f705-4e41-ab78-cd6e937c1627","Type":"ContainerDied","Data":"b428ae4d3fc8401f10299a861bae667f51c60c96124f03c79359209e8491c0bb"}
Mar 17 12:00:01 crc kubenswrapper[4742]: I0317 12:00:01.452007 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp" event={"ID":"994765fd-f705-4e41-ab78-cd6e937c1627","Type":"ContainerStarted","Data":"f6b0ae60bc5bd7b5cdd13a655b7d0505f785b2bc72c62f7bd6cf6fb7bb833912"}
Mar 17 12:00:01 crc kubenswrapper[4742]: I0317 12:00:01.453180 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562480-8l4gb" event={"ID":"ff23f76e-4787-4741-a13b-ada68fec94b9","Type":"ContainerStarted","Data":"38dc9c96e86ea0d265495b82377c4654a531c961115d70afe3fef92d0989c0d1"}
Mar 17 12:00:02 crc kubenswrapper[4742]: I0317 12:00:02.873291 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp"
Mar 17 12:00:03 crc kubenswrapper[4742]: I0317 12:00:03.061490 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59xgv\" (UniqueName: \"kubernetes.io/projected/994765fd-f705-4e41-ab78-cd6e937c1627-kube-api-access-59xgv\") pod \"994765fd-f705-4e41-ab78-cd6e937c1627\" (UID: \"994765fd-f705-4e41-ab78-cd6e937c1627\") "
Mar 17 12:00:03 crc kubenswrapper[4742]: I0317 12:00:03.061540 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/994765fd-f705-4e41-ab78-cd6e937c1627-secret-volume\") pod \"994765fd-f705-4e41-ab78-cd6e937c1627\" (UID: \"994765fd-f705-4e41-ab78-cd6e937c1627\") "
Mar 17 12:00:03 crc kubenswrapper[4742]: I0317 12:00:03.061781 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/994765fd-f705-4e41-ab78-cd6e937c1627-config-volume\") pod \"994765fd-f705-4e41-ab78-cd6e937c1627\" (UID: \"994765fd-f705-4e41-ab78-cd6e937c1627\") "
Mar 17 12:00:03 crc kubenswrapper[4742]: I0317 12:00:03.062389 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/994765fd-f705-4e41-ab78-cd6e937c1627-config-volume" (OuterVolumeSpecName: "config-volume") pod "994765fd-f705-4e41-ab78-cd6e937c1627" (UID: "994765fd-f705-4e41-ab78-cd6e937c1627"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 12:00:03 crc kubenswrapper[4742]: I0317 12:00:03.073236 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/994765fd-f705-4e41-ab78-cd6e937c1627-kube-api-access-59xgv" (OuterVolumeSpecName: "kube-api-access-59xgv") pod "994765fd-f705-4e41-ab78-cd6e937c1627" (UID: "994765fd-f705-4e41-ab78-cd6e937c1627"). InnerVolumeSpecName "kube-api-access-59xgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 12:00:03 crc kubenswrapper[4742]: I0317 12:00:03.074118 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/994765fd-f705-4e41-ab78-cd6e937c1627-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "994765fd-f705-4e41-ab78-cd6e937c1627" (UID: "994765fd-f705-4e41-ab78-cd6e937c1627"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 12:00:03 crc kubenswrapper[4742]: I0317 12:00:03.163595 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59xgv\" (UniqueName: \"kubernetes.io/projected/994765fd-f705-4e41-ab78-cd6e937c1627-kube-api-access-59xgv\") on node \"crc\" DevicePath \"\""
Mar 17 12:00:03 crc kubenswrapper[4742]: I0317 12:00:03.163627 4742 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/994765fd-f705-4e41-ab78-cd6e937c1627-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 17 12:00:03 crc kubenswrapper[4742]: I0317 12:00:03.163638 4742 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/994765fd-f705-4e41-ab78-cd6e937c1627-config-volume\") on node \"crc\" DevicePath \"\""
Mar 17 12:00:03 crc kubenswrapper[4742]: I0317 12:00:03.479707 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp" event={"ID":"994765fd-f705-4e41-ab78-cd6e937c1627","Type":"ContainerDied","Data":"f6b0ae60bc5bd7b5cdd13a655b7d0505f785b2bc72c62f7bd6cf6fb7bb833912"}
Mar 17 12:00:03 crc kubenswrapper[4742]: I0317 12:00:03.479766 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6b0ae60bc5bd7b5cdd13a655b7d0505f785b2bc72c62f7bd6cf6fb7bb833912"
Mar 17 12:00:03 crc kubenswrapper[4742]: I0317 12:00:03.479786 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562480-s6drp"
Mar 17 12:00:03 crc kubenswrapper[4742]: I0317 12:00:03.956312 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr"]
Mar 17 12:00:03 crc kubenswrapper[4742]: I0317 12:00:03.965651 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562435-hmhmr"]
Mar 17 12:00:04 crc kubenswrapper[4742]: I0317 12:00:04.683215 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75629956-e407-4638-90cd-fd2f907bb0fb" path="/var/lib/kubelet/pods/75629956-e407-4638-90cd-fd2f907bb0fb/volumes"
Mar 17 12:00:05 crc kubenswrapper[4742]: I0317 12:00:05.499707 4742 generic.go:334] "Generic (PLEG): container finished" podID="ff23f76e-4787-4741-a13b-ada68fec94b9" containerID="5411daf7000da77b741d2d85d3477e8df997cb414d8d420141d57f72fb9da9fe" exitCode=0
Mar 17 12:00:05 crc kubenswrapper[4742]: I0317 12:00:05.499887 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562480-8l4gb" event={"ID":"ff23f76e-4787-4741-a13b-ada68fec94b9","Type":"ContainerDied","Data":"5411daf7000da77b741d2d85d3477e8df997cb414d8d420141d57f72fb9da9fe"}
Mar 17 12:00:06 crc kubenswrapper[4742]: I0317 12:00:06.882133 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562480-8l4gb"
Mar 17 12:00:07 crc kubenswrapper[4742]: I0317 12:00:07.069144 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc85b\" (UniqueName: \"kubernetes.io/projected/ff23f76e-4787-4741-a13b-ada68fec94b9-kube-api-access-dc85b\") pod \"ff23f76e-4787-4741-a13b-ada68fec94b9\" (UID: \"ff23f76e-4787-4741-a13b-ada68fec94b9\") "
Mar 17 12:00:07 crc kubenswrapper[4742]: I0317 12:00:07.078588 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff23f76e-4787-4741-a13b-ada68fec94b9-kube-api-access-dc85b" (OuterVolumeSpecName: "kube-api-access-dc85b") pod "ff23f76e-4787-4741-a13b-ada68fec94b9" (UID: "ff23f76e-4787-4741-a13b-ada68fec94b9"). InnerVolumeSpecName "kube-api-access-dc85b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 12:00:07 crc kubenswrapper[4742]: I0317 12:00:07.171402 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc85b\" (UniqueName: \"kubernetes.io/projected/ff23f76e-4787-4741-a13b-ada68fec94b9-kube-api-access-dc85b\") on node \"crc\" DevicePath \"\""
Mar 17 12:00:07 crc kubenswrapper[4742]: I0317 12:00:07.526628 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562480-8l4gb" event={"ID":"ff23f76e-4787-4741-a13b-ada68fec94b9","Type":"ContainerDied","Data":"38dc9c96e86ea0d265495b82377c4654a531c961115d70afe3fef92d0989c0d1"}
Mar 17 12:00:07 crc kubenswrapper[4742]: I0317 12:00:07.526681 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38dc9c96e86ea0d265495b82377c4654a531c961115d70afe3fef92d0989c0d1"
Mar 17 12:00:07 crc kubenswrapper[4742]: I0317 12:00:07.526707 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562480-8l4gb"
Mar 17 12:00:07 crc kubenswrapper[4742]: I0317 12:00:07.947553 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562474-k9hnw"]
Mar 17 12:00:07 crc kubenswrapper[4742]: I0317 12:00:07.957021 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562474-k9hnw"]
Mar 17 12:00:08 crc kubenswrapper[4742]: I0317 12:00:08.713144 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9601ffdf-5d67-45e7-88da-12672c58e00e" path="/var/lib/kubelet/pods/9601ffdf-5d67-45e7-88da-12672c58e00e/volumes"
Mar 17 12:00:18 crc kubenswrapper[4742]: I0317 12:00:18.044307 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 12:00:18 crc kubenswrapper[4742]: I0317 12:00:18.044768 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 12:00:48 crc kubenswrapper[4742]: I0317 12:00:48.044255 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 12:00:48 crc kubenswrapper[4742]: I0317 12:00:48.044885 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 12:00:48 crc kubenswrapper[4742]: I0317 12:00:48.044970 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw"
Mar 17 12:00:48 crc kubenswrapper[4742]: I0317 12:00:48.045959 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50"} pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 17 12:00:48 crc kubenswrapper[4742]: I0317 12:00:48.046041 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" containerID="cri-o://291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" gracePeriod=600
Mar 17 12:00:48 crc kubenswrapper[4742]: E0317 12:00:48.172613 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882"
pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:00:48 crc kubenswrapper[4742]: I0317 12:00:48.950798 4742 generic.go:334] "Generic (PLEG): container finished" podID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" exitCode=0 Mar 17 12:00:48 crc kubenswrapper[4742]: I0317 12:00:48.950842 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerDied","Data":"291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50"} Mar 17 12:00:48 crc kubenswrapper[4742]: I0317 12:00:48.950885 4742 scope.go:117] "RemoveContainer" containerID="1750f423eed5ff73f33cbeaf0c7b4d19bec40d4ee6133f2c8141f4cdf4bd6cdf" Mar 17 12:00:48 crc kubenswrapper[4742]: I0317 12:00:48.951850 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:00:48 crc kubenswrapper[4742]: E0317 12:00:48.952359 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:00:56 crc kubenswrapper[4742]: I0317 12:00:56.443124 4742 scope.go:117] "RemoveContainer" containerID="8f7a71723f081cfb9b32621aa20813e40d3a14bb63ad901b7e6005a37bb634b1" Mar 17 12:00:56 crc kubenswrapper[4742]: I0317 12:00:56.502486 4742 scope.go:117] "RemoveContainer" containerID="300d665f9fa4f8318b1145926cca19ef60266854c5ffcc0a3bc845995ee1c214" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.167420 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29562481-pwxss"] Mar 17 12:01:00 crc kubenswrapper[4742]: E0317 12:01:00.168418 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="994765fd-f705-4e41-ab78-cd6e937c1627" containerName="collect-profiles" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.168435 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="994765fd-f705-4e41-ab78-cd6e937c1627" containerName="collect-profiles" Mar 17 12:01:00 crc kubenswrapper[4742]: E0317 12:01:00.168473 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff23f76e-4787-4741-a13b-ada68fec94b9" containerName="oc" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.168482 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff23f76e-4787-4741-a13b-ada68fec94b9" containerName="oc" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.168689 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff23f76e-4787-4741-a13b-ada68fec94b9" containerName="oc" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.168730 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="994765fd-f705-4e41-ab78-cd6e937c1627" containerName="collect-profiles" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.169505 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29562481-pwxss" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.177662 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29562481-pwxss"] Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.271214 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52x9h\" (UniqueName: \"kubernetes.io/projected/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-kube-api-access-52x9h\") pod \"keystone-cron-29562481-pwxss\" (UID: \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\") " pod="openstack/keystone-cron-29562481-pwxss" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.271359 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-combined-ca-bundle\") pod \"keystone-cron-29562481-pwxss\" (UID: \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\") " pod="openstack/keystone-cron-29562481-pwxss" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.271478 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-config-data\") pod \"keystone-cron-29562481-pwxss\" (UID: \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\") " pod="openstack/keystone-cron-29562481-pwxss" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.271600 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-fernet-keys\") pod \"keystone-cron-29562481-pwxss\" (UID: \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\") " pod="openstack/keystone-cron-29562481-pwxss" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.373171 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-fernet-keys\") pod \"keystone-cron-29562481-pwxss\" (UID: \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\") " pod="openstack/keystone-cron-29562481-pwxss" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.373275 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52x9h\" (UniqueName: \"kubernetes.io/projected/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-kube-api-access-52x9h\") pod \"keystone-cron-29562481-pwxss\" (UID: \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\") " pod="openstack/keystone-cron-29562481-pwxss" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.373317 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-combined-ca-bundle\") pod \"keystone-cron-29562481-pwxss\" (UID: \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\") " pod="openstack/keystone-cron-29562481-pwxss" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.373352 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-config-data\") pod \"keystone-cron-29562481-pwxss\" (UID: \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\") " pod="openstack/keystone-cron-29562481-pwxss" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.388312 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-config-data\") pod \"keystone-cron-29562481-pwxss\" (UID: \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\") " pod="openstack/keystone-cron-29562481-pwxss" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.388410 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-combined-ca-bundle\") pod \"keystone-cron-29562481-pwxss\" (UID: \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\") " pod="openstack/keystone-cron-29562481-pwxss" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.403979 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-fernet-keys\") pod \"keystone-cron-29562481-pwxss\" (UID: \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\") " pod="openstack/keystone-cron-29562481-pwxss" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.426964 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52x9h\" (UniqueName: \"kubernetes.io/projected/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-kube-api-access-52x9h\") pod \"keystone-cron-29562481-pwxss\" (UID: \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\") " pod="openstack/keystone-cron-29562481-pwxss" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.501594 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29562481-pwxss" Mar 17 12:01:00 crc kubenswrapper[4742]: I0317 12:01:00.852459 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29562481-pwxss"] Mar 17 12:01:01 crc kubenswrapper[4742]: I0317 12:01:01.103666 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29562481-pwxss" event={"ID":"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5","Type":"ContainerStarted","Data":"c5599964d15deca607dadf7f5612224230b7b90d5a33a356cd4e537a2ab760d4"} Mar 17 12:01:01 crc kubenswrapper[4742]: I0317 12:01:01.103726 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29562481-pwxss" event={"ID":"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5","Type":"ContainerStarted","Data":"fe0ba4558088b1ceeafc819a85fbb6a0b7052c2418dd67d62284a1d3dd13bd58"} Mar 17 12:01:01 crc kubenswrapper[4742]: I0317 12:01:01.128654 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29562481-pwxss" podStartSLOduration=1.128625715 podStartE2EDuration="1.128625715s" podCreationTimestamp="2026-03-17 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 12:01:01.119516608 +0000 UTC m=+2964.245644386" watchObservedRunningTime="2026-03-17 12:01:01.128625715 +0000 UTC m=+2964.254753463" Mar 17 12:01:01 crc kubenswrapper[4742]: I0317 12:01:01.663498 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:01:01 crc kubenswrapper[4742]: E0317 12:01:01.664256 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:01:03 crc kubenswrapper[4742]: I0317 12:01:03.125609 4742 generic.go:334] "Generic (PLEG): container finished" podID="7cfb9cd7-2718-4547-a238-e62cfa4f3cb5" containerID="c5599964d15deca607dadf7f5612224230b7b90d5a33a356cd4e537a2ab760d4" exitCode=0 Mar 17 12:01:03 crc kubenswrapper[4742]: I0317 12:01:03.125678 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29562481-pwxss" event={"ID":"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5","Type":"ContainerDied","Data":"c5599964d15deca607dadf7f5612224230b7b90d5a33a356cd4e537a2ab760d4"} Mar 17 12:01:04 crc kubenswrapper[4742]: I0317 12:01:04.487746 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29562481-pwxss" Mar 17 12:01:04 crc kubenswrapper[4742]: I0317 12:01:04.571488 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-combined-ca-bundle\") pod \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\" (UID: \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\") " Mar 17 12:01:04 crc kubenswrapper[4742]: I0317 12:01:04.571567 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-fernet-keys\") pod \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\" (UID: \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\") " Mar 17 12:01:04 crc kubenswrapper[4742]: I0317 12:01:04.571776 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52x9h\" (UniqueName: \"kubernetes.io/projected/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-kube-api-access-52x9h\") pod \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\" (UID: \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\") " Mar 17 12:01:04 crc kubenswrapper[4742]: I0317 12:01:04.571828 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-config-data\") pod \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\" (UID: \"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5\") " Mar 17 12:01:04 crc kubenswrapper[4742]: I0317 12:01:04.578808 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-kube-api-access-52x9h" (OuterVolumeSpecName: "kube-api-access-52x9h") pod "7cfb9cd7-2718-4547-a238-e62cfa4f3cb5" (UID: "7cfb9cd7-2718-4547-a238-e62cfa4f3cb5"). InnerVolumeSpecName "kube-api-access-52x9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:01:04 crc kubenswrapper[4742]: I0317 12:01:04.591189 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7cfb9cd7-2718-4547-a238-e62cfa4f3cb5" (UID: "7cfb9cd7-2718-4547-a238-e62cfa4f3cb5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 12:01:04 crc kubenswrapper[4742]: I0317 12:01:04.607438 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cfb9cd7-2718-4547-a238-e62cfa4f3cb5" (UID: "7cfb9cd7-2718-4547-a238-e62cfa4f3cb5"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 12:01:04 crc kubenswrapper[4742]: I0317 12:01:04.663692 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-config-data" (OuterVolumeSpecName: "config-data") pod "7cfb9cd7-2718-4547-a238-e62cfa4f3cb5" (UID: "7cfb9cd7-2718-4547-a238-e62cfa4f3cb5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 12:01:04 crc kubenswrapper[4742]: I0317 12:01:04.676557 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 12:01:04 crc kubenswrapper[4742]: I0317 12:01:04.676593 4742 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 17 12:01:04 crc kubenswrapper[4742]: I0317 12:01:04.676608 4742 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 17 12:01:04 crc kubenswrapper[4742]: I0317 12:01:04.676620 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52x9h\" (UniqueName: \"kubernetes.io/projected/7cfb9cd7-2718-4547-a238-e62cfa4f3cb5-kube-api-access-52x9h\") on node \"crc\" DevicePath \"\"" Mar 17 12:01:05 crc kubenswrapper[4742]: I0317 12:01:05.161209 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29562481-pwxss" event={"ID":"7cfb9cd7-2718-4547-a238-e62cfa4f3cb5","Type":"ContainerDied","Data":"fe0ba4558088b1ceeafc819a85fbb6a0b7052c2418dd67d62284a1d3dd13bd58"} Mar 17 12:01:05 crc kubenswrapper[4742]: I0317 12:01:05.161249 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe0ba4558088b1ceeafc819a85fbb6a0b7052c2418dd67d62284a1d3dd13bd58" Mar 17 12:01:05 crc kubenswrapper[4742]: I0317 12:01:05.161272 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29562481-pwxss" Mar 17 12:01:12 crc kubenswrapper[4742]: I0317 12:01:12.663068 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:01:12 crc kubenswrapper[4742]: E0317 12:01:12.664678 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:01:26 crc kubenswrapper[4742]: I0317 12:01:26.663474 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:01:26 crc kubenswrapper[4742]: E0317 12:01:26.664405 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:01:41 crc kubenswrapper[4742]: I0317 12:01:41.664251 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:01:41 crc kubenswrapper[4742]: E0317 12:01:41.665497 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:01:52 crc kubenswrapper[4742]: I0317 12:01:52.663087 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:01:52 crc kubenswrapper[4742]: E0317 12:01:52.663759 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:02:00 crc kubenswrapper[4742]: I0317 12:02:00.163953 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562482-vpxzt"] Mar 17 12:02:00 crc kubenswrapper[4742]: E0317 12:02:00.165008 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cfb9cd7-2718-4547-a238-e62cfa4f3cb5" containerName="keystone-cron" Mar 17 12:02:00 crc kubenswrapper[4742]: I0317 12:02:00.165029 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cfb9cd7-2718-4547-a238-e62cfa4f3cb5" containerName="keystone-cron" Mar 17 12:02:00 crc kubenswrapper[4742]: I0317 12:02:00.165354 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cfb9cd7-2718-4547-a238-e62cfa4f3cb5" containerName="keystone-cron" Mar 17 12:02:00 crc 
kubenswrapper[4742]: I0317 12:02:00.166120 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562482-vpxzt" Mar 17 12:02:00 crc kubenswrapper[4742]: I0317 12:02:00.171937 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 12:02:00 crc kubenswrapper[4742]: I0317 12:02:00.171989 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 12:02:00 crc kubenswrapper[4742]: I0317 12:02:00.178106 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 12:02:00 crc kubenswrapper[4742]: I0317 12:02:00.188529 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562482-vpxzt"] Mar 17 12:02:00 crc kubenswrapper[4742]: I0317 12:02:00.342651 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdklb\" (UniqueName: \"kubernetes.io/projected/233a66be-bd9a-4210-9832-16f145d41f0a-kube-api-access-hdklb\") pod \"auto-csr-approver-29562482-vpxzt\" (UID: \"233a66be-bd9a-4210-9832-16f145d41f0a\") " pod="openshift-infra/auto-csr-approver-29562482-vpxzt" Mar 17 12:02:00 crc kubenswrapper[4742]: I0317 12:02:00.444880 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdklb\" (UniqueName: \"kubernetes.io/projected/233a66be-bd9a-4210-9832-16f145d41f0a-kube-api-access-hdklb\") pod \"auto-csr-approver-29562482-vpxzt\" (UID: \"233a66be-bd9a-4210-9832-16f145d41f0a\") " pod="openshift-infra/auto-csr-approver-29562482-vpxzt" Mar 17 12:02:00 crc kubenswrapper[4742]: I0317 12:02:00.466876 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdklb\" (UniqueName: \"kubernetes.io/projected/233a66be-bd9a-4210-9832-16f145d41f0a-kube-api-access-hdklb\") pod \"auto-csr-approver-29562482-vpxzt\" (UID: \"233a66be-bd9a-4210-9832-16f145d41f0a\") " pod="openshift-infra/auto-csr-approver-29562482-vpxzt" Mar 17 12:02:00 crc kubenswrapper[4742]: I0317 12:02:00.487683 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562482-vpxzt" Mar 17 12:02:00 crc kubenswrapper[4742]: I0317 12:02:00.948809 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562482-vpxzt"] Mar 17 12:02:01 crc kubenswrapper[4742]: I0317 12:02:01.761860 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562482-vpxzt" event={"ID":"233a66be-bd9a-4210-9832-16f145d41f0a","Type":"ContainerStarted","Data":"b9aa3524f5b2816230e7bbec8a8c541dde833fce68441db107161550de263265"} Mar 17 12:02:02 crc kubenswrapper[4742]: I0317 12:02:02.778838 4742 generic.go:334] "Generic (PLEG): container finished" podID="233a66be-bd9a-4210-9832-16f145d41f0a" containerID="f1a67ed09ec19b9d95269a3b2d125bd7572d3c6f3b8c738e20ad27ae053ade71" exitCode=0 Mar 17 12:02:02 crc kubenswrapper[4742]: I0317 12:02:02.779144 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562482-vpxzt" event={"ID":"233a66be-bd9a-4210-9832-16f145d41f0a","Type":"ContainerDied","Data":"f1a67ed09ec19b9d95269a3b2d125bd7572d3c6f3b8c738e20ad27ae053ade71"} Mar 17 12:02:03 crc kubenswrapper[4742]: I0317 12:02:03.663449 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:02:03 crc kubenswrapper[4742]: E0317 12:02:03.664185 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:02:04 crc kubenswrapper[4742]: I0317 12:02:04.252620 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562482-vpxzt" Mar 17 12:02:04 crc kubenswrapper[4742]: I0317 12:02:04.325611 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdklb\" (UniqueName: \"kubernetes.io/projected/233a66be-bd9a-4210-9832-16f145d41f0a-kube-api-access-hdklb\") pod \"233a66be-bd9a-4210-9832-16f145d41f0a\" (UID: \"233a66be-bd9a-4210-9832-16f145d41f0a\") " Mar 17 12:02:04 crc kubenswrapper[4742]: I0317 12:02:04.330950 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/233a66be-bd9a-4210-9832-16f145d41f0a-kube-api-access-hdklb" (OuterVolumeSpecName: "kube-api-access-hdklb") pod "233a66be-bd9a-4210-9832-16f145d41f0a" (UID: "233a66be-bd9a-4210-9832-16f145d41f0a"). InnerVolumeSpecName "kube-api-access-hdklb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:02:04 crc kubenswrapper[4742]: I0317 12:02:04.427668 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdklb\" (UniqueName: \"kubernetes.io/projected/233a66be-bd9a-4210-9832-16f145d41f0a-kube-api-access-hdklb\") on node \"crc\" DevicePath \"\"" Mar 17 12:02:04 crc kubenswrapper[4742]: I0317 12:02:04.804688 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562482-vpxzt" event={"ID":"233a66be-bd9a-4210-9832-16f145d41f0a","Type":"ContainerDied","Data":"b9aa3524f5b2816230e7bbec8a8c541dde833fce68441db107161550de263265"} Mar 17 12:02:04 crc kubenswrapper[4742]: I0317 12:02:04.804753 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9aa3524f5b2816230e7bbec8a8c541dde833fce68441db107161550de263265" Mar 17 12:02:04 crc kubenswrapper[4742]: I0317 12:02:04.804808 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562482-vpxzt" Mar 17 12:02:05 crc kubenswrapper[4742]: I0317 12:02:05.330767 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562476-lq8cp"] Mar 17 12:02:05 crc kubenswrapper[4742]: I0317 12:02:05.343008 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562476-lq8cp"] Mar 17 12:02:06 crc kubenswrapper[4742]: I0317 12:02:06.673471 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c08c12ce-725a-4532-990a-136c1cf8c8a3" path="/var/lib/kubelet/pods/c08c12ce-725a-4532-990a-136c1cf8c8a3/volumes" Mar 17 12:02:18 crc kubenswrapper[4742]: I0317 12:02:18.678897 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:02:18 crc kubenswrapper[4742]: E0317 12:02:18.680059 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:02:31 crc kubenswrapper[4742]: I0317 12:02:31.662551 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:02:31 crc kubenswrapper[4742]: E0317 12:02:31.663359 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:02:42 crc kubenswrapper[4742]: I0317 12:02:42.663850 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:02:42 crc kubenswrapper[4742]: E0317 12:02:42.664677 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:02:56 crc kubenswrapper[4742]: I0317 12:02:56.603468 4742 scope.go:117] "RemoveContainer" containerID="784c9eb64c3ba726b37a43a52b288409f503d47fc21d7380a1bfd3e5b8a2aac0" Mar 17 12:02:57 crc kubenswrapper[4742]: I0317 12:02:57.663093 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:02:57 crc kubenswrapper[4742]: E0317 12:02:57.663776 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:03:11 crc kubenswrapper[4742]: I0317 12:03:11.664084 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:03:11 crc kubenswrapper[4742]: E0317 12:03:11.664976 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:03:22 crc kubenswrapper[4742]: I0317 12:03:22.663612 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:03:22 crc kubenswrapper[4742]: E0317 12:03:22.664874 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:03:35 crc kubenswrapper[4742]: I0317 12:03:35.663751 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:03:35 crc kubenswrapper[4742]: E0317 12:03:35.665179 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:03:46 crc kubenswrapper[4742]: I0317 12:03:46.662887 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:03:46 crc kubenswrapper[4742]: E0317 12:03:46.663807 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:04:00 crc kubenswrapper[4742]: I0317 12:04:00.176934 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562484-w74rx"] Mar 17 12:04:00 crc kubenswrapper[4742]: E0317 12:04:00.178265 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233a66be-bd9a-4210-9832-16f145d41f0a" containerName="oc" Mar 17 12:04:00 crc kubenswrapper[4742]: I0317 12:04:00.178285 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="233a66be-bd9a-4210-9832-16f145d41f0a" containerName="oc" Mar 17 12:04:00 crc kubenswrapper[4742]: I0317 12:04:00.178544 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="233a66be-bd9a-4210-9832-16f145d41f0a" containerName="oc" Mar 17 12:04:00 crc kubenswrapper[4742]: I0317 12:04:00.179363 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562484-w74rx" Mar 17 12:04:00 crc kubenswrapper[4742]: I0317 12:04:00.182755 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 12:04:00 crc kubenswrapper[4742]: I0317 12:04:00.184575 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 12:04:00 crc kubenswrapper[4742]: I0317 12:04:00.185863 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 12:04:00 crc kubenswrapper[4742]: I0317 12:04:00.197122 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562484-w74rx"] Mar 17 12:04:00 crc kubenswrapper[4742]: I0317 12:04:00.289224 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52lh2\" (UniqueName: \"kubernetes.io/projected/d0cc7105-8840-46be-9cf2-a52080296716-kube-api-access-52lh2\") pod \"auto-csr-approver-29562484-w74rx\" (UID: \"d0cc7105-8840-46be-9cf2-a52080296716\") " pod="openshift-infra/auto-csr-approver-29562484-w74rx" Mar 17 12:04:00 crc kubenswrapper[4742]: I0317 12:04:00.391473 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52lh2\" (UniqueName: \"kubernetes.io/projected/d0cc7105-8840-46be-9cf2-a52080296716-kube-api-access-52lh2\") pod \"auto-csr-approver-29562484-w74rx\" (UID: \"d0cc7105-8840-46be-9cf2-a52080296716\") " pod="openshift-infra/auto-csr-approver-29562484-w74rx" Mar 17 12:04:00 crc kubenswrapper[4742]: I0317 12:04:00.415866 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52lh2\" (UniqueName: \"kubernetes.io/projected/d0cc7105-8840-46be-9cf2-a52080296716-kube-api-access-52lh2\") pod \"auto-csr-approver-29562484-w74rx\" (UID: \"d0cc7105-8840-46be-9cf2-a52080296716\") " pod="openshift-infra/auto-csr-approver-29562484-w74rx" Mar 17 12:04:00 crc kubenswrapper[4742]: I0317 12:04:00.516678 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562484-w74rx" Mar 17 12:04:00 crc kubenswrapper[4742]: I0317 12:04:00.968460 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562484-w74rx"] Mar 17 12:04:01 crc kubenswrapper[4742]: I0317 12:04:01.532764 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562484-w74rx" event={"ID":"d0cc7105-8840-46be-9cf2-a52080296716","Type":"ContainerStarted","Data":"2c2187868dcc4f7e2ff484eb2afa747cc6a50e5ce7e38f8e6cf413a7f77c2d0f"} Mar 17 12:04:01 crc kubenswrapper[4742]: I0317 12:04:01.663533 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:04:01 crc kubenswrapper[4742]: E0317 12:04:01.664155 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:04:02 crc kubenswrapper[4742]: I0317 12:04:02.543965 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562484-w74rx" event={"ID":"d0cc7105-8840-46be-9cf2-a52080296716","Type":"ContainerStarted","Data":"ea61e793723da8f929adfd34c5a03b15fadd8b901ade8d0ae8e71b3824e9973d"} Mar 17 12:04:02 crc kubenswrapper[4742]: I0317 12:04:02.561829 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562484-w74rx" podStartSLOduration=1.487448852 podStartE2EDuration="2.561809765s" podCreationTimestamp="2026-03-17 12:04:00 +0000 UTC" firstStartedPulling="2026-03-17 12:04:00.975009158 +0000 UTC m=+3144.101136926" lastFinishedPulling="2026-03-17 12:04:02.049370081 +0000 UTC m=+3145.175497839" observedRunningTime="2026-03-17 12:04:02.555114983 +0000 UTC m=+3145.681242751" watchObservedRunningTime="2026-03-17 12:04:02.561809765 +0000 UTC m=+3145.687937523" Mar 17 12:04:03 crc kubenswrapper[4742]: I0317 12:04:03.554773 4742 generic.go:334] "Generic (PLEG): container finished" podID="d0cc7105-8840-46be-9cf2-a52080296716" containerID="ea61e793723da8f929adfd34c5a03b15fadd8b901ade8d0ae8e71b3824e9973d" exitCode=0 Mar 17 12:04:03 crc kubenswrapper[4742]: I0317 12:04:03.554811 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562484-w74rx" event={"ID":"d0cc7105-8840-46be-9cf2-a52080296716","Type":"ContainerDied","Data":"ea61e793723da8f929adfd34c5a03b15fadd8b901ade8d0ae8e71b3824e9973d"} Mar 17 12:04:05 crc kubenswrapper[4742]: I0317 12:04:05.015967 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562484-w74rx" Mar 17 12:04:05 crc kubenswrapper[4742]: I0317 12:04:05.083551 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52lh2\" (UniqueName: \"kubernetes.io/projected/d0cc7105-8840-46be-9cf2-a52080296716-kube-api-access-52lh2\") pod \"d0cc7105-8840-46be-9cf2-a52080296716\" (UID: \"d0cc7105-8840-46be-9cf2-a52080296716\") " Mar 17 12:04:05 crc kubenswrapper[4742]: I0317 12:04:05.091868 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0cc7105-8840-46be-9cf2-a52080296716-kube-api-access-52lh2" (OuterVolumeSpecName: "kube-api-access-52lh2") pod "d0cc7105-8840-46be-9cf2-a52080296716" (UID: "d0cc7105-8840-46be-9cf2-a52080296716"). InnerVolumeSpecName "kube-api-access-52lh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:04:05 crc kubenswrapper[4742]: I0317 12:04:05.185778 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52lh2\" (UniqueName: \"kubernetes.io/projected/d0cc7105-8840-46be-9cf2-a52080296716-kube-api-access-52lh2\") on node \"crc\" DevicePath \"\"" Mar 17 12:04:05 crc kubenswrapper[4742]: I0317 12:04:05.577456 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562484-w74rx" event={"ID":"d0cc7105-8840-46be-9cf2-a52080296716","Type":"ContainerDied","Data":"2c2187868dcc4f7e2ff484eb2afa747cc6a50e5ce7e38f8e6cf413a7f77c2d0f"} Mar 17 12:04:05 crc kubenswrapper[4742]: I0317 12:04:05.577526 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c2187868dcc4f7e2ff484eb2afa747cc6a50e5ce7e38f8e6cf413a7f77c2d0f" Mar 17 12:04:05 crc kubenswrapper[4742]: I0317 12:04:05.577542 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562484-w74rx" Mar 17 12:04:05 crc kubenswrapper[4742]: I0317 12:04:05.651963 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562478-47wq6"] Mar 17 12:04:05 crc kubenswrapper[4742]: I0317 12:04:05.662271 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562478-47wq6"] Mar 17 12:04:06 crc kubenswrapper[4742]: I0317 12:04:06.693170 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="610be10f-b292-4196-bea3-8a5de3e562c3" path="/var/lib/kubelet/pods/610be10f-b292-4196-bea3-8a5de3e562c3/volumes" Mar 17 12:04:16 crc kubenswrapper[4742]: I0317 12:04:16.663320 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:04:16 crc kubenswrapper[4742]: E0317 12:04:16.664361 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:04:28 crc kubenswrapper[4742]: I0317 12:04:28.675630 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:04:28 crc kubenswrapper[4742]: E0317 12:04:28.678052 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:04:34 crc kubenswrapper[4742]: I0317 12:04:34.421604 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jdm8k"] Mar 17 12:04:34 crc kubenswrapper[4742]: E0317 12:04:34.422383 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0cc7105-8840-46be-9cf2-a52080296716" containerName="oc" Mar 17 12:04:34 crc kubenswrapper[4742]: I0317 12:04:34.422394 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0cc7105-8840-46be-9cf2-a52080296716" containerName="oc" Mar 17 12:04:34 crc kubenswrapper[4742]: I0317 12:04:34.422591 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0cc7105-8840-46be-9cf2-a52080296716" containerName="oc" Mar 17 12:04:34 crc kubenswrapper[4742]: I0317 12:04:34.423811 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jdm8k" Mar 17 12:04:34 crc kubenswrapper[4742]: I0317 12:04:34.436877 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jdm8k"] Mar 17 12:04:34 crc kubenswrapper[4742]: I0317 12:04:34.571437 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a140421-46d2-46bf-afd0-ea659696c39e-catalog-content\") pod \"redhat-marketplace-jdm8k\" (UID: \"5a140421-46d2-46bf-afd0-ea659696c39e\") " pod="openshift-marketplace/redhat-marketplace-jdm8k" Mar 17 12:04:34 crc kubenswrapper[4742]: I0317 12:04:34.571939 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxjgh\" (UniqueName: \"kubernetes.io/projected/5a140421-46d2-46bf-afd0-ea659696c39e-kube-api-access-vxjgh\") pod \"redhat-marketplace-jdm8k\" (UID: \"5a140421-46d2-46bf-afd0-ea659696c39e\") " pod="openshift-marketplace/redhat-marketplace-jdm8k" Mar 17 12:04:34 crc kubenswrapper[4742]: I0317 12:04:34.572160 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a140421-46d2-46bf-afd0-ea659696c39e-utilities\") pod \"redhat-marketplace-jdm8k\" (UID: \"5a140421-46d2-46bf-afd0-ea659696c39e\") " pod="openshift-marketplace/redhat-marketplace-jdm8k" Mar 17 12:04:34 crc kubenswrapper[4742]: I0317 12:04:34.680473 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a140421-46d2-46bf-afd0-ea659696c39e-utilities\") pod \"redhat-marketplace-jdm8k\" (UID: \"5a140421-46d2-46bf-afd0-ea659696c39e\") " pod="openshift-marketplace/redhat-marketplace-jdm8k" Mar 17 12:04:34 crc kubenswrapper[4742]: I0317 12:04:34.680896 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a140421-46d2-46bf-afd0-ea659696c39e-catalog-content\") pod \"redhat-marketplace-jdm8k\" (UID: \"5a140421-46d2-46bf-afd0-ea659696c39e\") " pod="openshift-marketplace/redhat-marketplace-jdm8k" Mar 17 12:04:34 crc kubenswrapper[4742]: I0317 12:04:34.681066 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxjgh\" (UniqueName: \"kubernetes.io/projected/5a140421-46d2-46bf-afd0-ea659696c39e-kube-api-access-vxjgh\") pod \"redhat-marketplace-jdm8k\" (UID: \"5a140421-46d2-46bf-afd0-ea659696c39e\") " pod="openshift-marketplace/redhat-marketplace-jdm8k" Mar 17 12:04:34 crc kubenswrapper[4742]: I0317 12:04:34.681160 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a140421-46d2-46bf-afd0-ea659696c39e-utilities\") pod \"redhat-marketplace-jdm8k\" (UID: \"5a140421-46d2-46bf-afd0-ea659696c39e\") " pod="openshift-marketplace/redhat-marketplace-jdm8k" Mar 17 12:04:34 crc kubenswrapper[4742]: I0317 12:04:34.681265 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a140421-46d2-46bf-afd0-ea659696c39e-catalog-content\") pod \"redhat-marketplace-jdm8k\" (UID: \"5a140421-46d2-46bf-afd0-ea659696c39e\") " pod="openshift-marketplace/redhat-marketplace-jdm8k" Mar 17 12:04:34 crc kubenswrapper[4742]: I0317 12:04:34.708749 4742 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vxjgh\" (UniqueName: \"kubernetes.io/projected/5a140421-46d2-46bf-afd0-ea659696c39e-kube-api-access-vxjgh\") pod \"redhat-marketplace-jdm8k\" (UID: \"5a140421-46d2-46bf-afd0-ea659696c39e\") " pod="openshift-marketplace/redhat-marketplace-jdm8k" Mar 17 12:04:34 crc kubenswrapper[4742]: I0317 12:04:34.747359 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jdm8k" Mar 17 12:04:35 crc kubenswrapper[4742]: I0317 12:04:35.251018 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jdm8k"] Mar 17 12:04:35 crc kubenswrapper[4742]: I0317 12:04:35.912308 4742 generic.go:334] "Generic (PLEG): container finished" podID="5a140421-46d2-46bf-afd0-ea659696c39e" containerID="e79a3f52c5022e2d2072115c98c9cc4b8f65de1ed18d6b766e4ff3de81058cd4" exitCode=0 Mar 17 12:04:35 crc kubenswrapper[4742]: I0317 12:04:35.912382 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jdm8k" event={"ID":"5a140421-46d2-46bf-afd0-ea659696c39e","Type":"ContainerDied","Data":"e79a3f52c5022e2d2072115c98c9cc4b8f65de1ed18d6b766e4ff3de81058cd4"} Mar 17 12:04:35 crc kubenswrapper[4742]: I0317 12:04:35.912779 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jdm8k" event={"ID":"5a140421-46d2-46bf-afd0-ea659696c39e","Type":"ContainerStarted","Data":"bbe2d7aa0e9ea232e60413fdb6df04b7960fb18f41ae588f74eabf0eac28cd2f"} Mar 17 12:04:35 crc kubenswrapper[4742]: I0317 12:04:35.918528 4742 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 12:04:36 crc kubenswrapper[4742]: I0317 12:04:36.922478 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jdm8k" event={"ID":"5a140421-46d2-46bf-afd0-ea659696c39e","Type":"ContainerStarted","Data":"8ea0131e34dbbe47e0c267836570905ef59c3fd0ff460007891d9ae840542e6e"} Mar 17 12:04:37 crc kubenswrapper[4742]: I0317 12:04:37.936808 4742 generic.go:334] "Generic (PLEG): container finished" podID="5a140421-46d2-46bf-afd0-ea659696c39e" containerID="8ea0131e34dbbe47e0c267836570905ef59c3fd0ff460007891d9ae840542e6e" exitCode=0 Mar 17 12:04:37 crc kubenswrapper[4742]: I0317 12:04:37.936886 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jdm8k" event={"ID":"5a140421-46d2-46bf-afd0-ea659696c39e","Type":"ContainerDied","Data":"8ea0131e34dbbe47e0c267836570905ef59c3fd0ff460007891d9ae840542e6e"} Mar 17 12:04:38 crc kubenswrapper[4742]: I0317 12:04:38.952378 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jdm8k" event={"ID":"5a140421-46d2-46bf-afd0-ea659696c39e","Type":"ContainerStarted","Data":"086446d56dcf58182e48e9e2d1cd8177824862bae41e66202a706759ac22b546"} Mar 17 12:04:38 crc kubenswrapper[4742]: I0317 12:04:38.982021 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jdm8k" podStartSLOduration=2.528406196 podStartE2EDuration="4.982002633s" podCreationTimestamp="2026-03-17 12:04:34 +0000 UTC" firstStartedPulling="2026-03-17 12:04:35.917827473 +0000 UTC m=+3179.043955271" lastFinishedPulling="2026-03-17 12:04:38.37142394 +0000 UTC m=+3181.497551708" observedRunningTime="2026-03-17 12:04:38.977275743 +0000 UTC m=+3182.103403501" watchObservedRunningTime="2026-03-17 12:04:38.982002633 +0000 UTC 
m=+3182.108130391" Mar 17 12:04:39 crc kubenswrapper[4742]: I0317 12:04:39.662600 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:04:39 crc kubenswrapper[4742]: E0317 12:04:39.662842 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:04:44 crc kubenswrapper[4742]: I0317 12:04:44.748196 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jdm8k" Mar 17 12:04:44 crc kubenswrapper[4742]: I0317 12:04:44.749009 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jdm8k" Mar 17 12:04:44 crc kubenswrapper[4742]: I0317 12:04:44.808359 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jdm8k" Mar 17 12:04:45 crc kubenswrapper[4742]: I0317 12:04:45.097401 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jdm8k" Mar 17 12:04:45 crc kubenswrapper[4742]: I0317 12:04:45.144036 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jdm8k"] Mar 17 12:04:47 crc kubenswrapper[4742]: I0317 12:04:47.053143 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jdm8k" podUID="5a140421-46d2-46bf-afd0-ea659696c39e" containerName="registry-server" containerID="cri-o://086446d56dcf58182e48e9e2d1cd8177824862bae41e66202a706759ac22b546" gracePeriod=2 Mar 17 12:04:48 crc kubenswrapper[4742]: I0317 12:04:48.069055 4742 generic.go:334] "Generic (PLEG): container finished" podID="5a140421-46d2-46bf-afd0-ea659696c39e" containerID="086446d56dcf58182e48e9e2d1cd8177824862bae41e66202a706759ac22b546" exitCode=0 Mar 17 12:04:48 crc kubenswrapper[4742]: I0317 12:04:48.069123 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jdm8k" event={"ID":"5a140421-46d2-46bf-afd0-ea659696c39e","Type":"ContainerDied","Data":"086446d56dcf58182e48e9e2d1cd8177824862bae41e66202a706759ac22b546"} Mar 17 12:04:48 crc kubenswrapper[4742]: I0317 12:04:48.069623 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jdm8k" event={"ID":"5a140421-46d2-46bf-afd0-ea659696c39e","Type":"ContainerDied","Data":"bbe2d7aa0e9ea232e60413fdb6df04b7960fb18f41ae588f74eabf0eac28cd2f"} Mar 17 12:04:48 crc kubenswrapper[4742]: I0317 12:04:48.069642 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbe2d7aa0e9ea232e60413fdb6df04b7960fb18f41ae588f74eabf0eac28cd2f" Mar 17 12:04:48 crc kubenswrapper[4742]: I0317 12:04:48.079647 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jdm8k" Mar 17 12:04:48 crc kubenswrapper[4742]: I0317 12:04:48.274460 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxjgh\" (UniqueName: \"kubernetes.io/projected/5a140421-46d2-46bf-afd0-ea659696c39e-kube-api-access-vxjgh\") pod \"5a140421-46d2-46bf-afd0-ea659696c39e\" (UID: \"5a140421-46d2-46bf-afd0-ea659696c39e\") " Mar 17 12:04:48 crc kubenswrapper[4742]: I0317 12:04:48.274544 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a140421-46d2-46bf-afd0-ea659696c39e-utilities\") pod \"5a140421-46d2-46bf-afd0-ea659696c39e\" (UID: \"5a140421-46d2-46bf-afd0-ea659696c39e\") " Mar 17 12:04:48 crc kubenswrapper[4742]: I0317 12:04:48.274715 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a140421-46d2-46bf-afd0-ea659696c39e-catalog-content\") pod \"5a140421-46d2-46bf-afd0-ea659696c39e\" (UID: \"5a140421-46d2-46bf-afd0-ea659696c39e\") " Mar 17 12:04:48 crc kubenswrapper[4742]: I0317 12:04:48.275800 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a140421-46d2-46bf-afd0-ea659696c39e-utilities" (OuterVolumeSpecName: "utilities") pod "5a140421-46d2-46bf-afd0-ea659696c39e" (UID: "5a140421-46d2-46bf-afd0-ea659696c39e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:04:48 crc kubenswrapper[4742]: I0317 12:04:48.284119 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a140421-46d2-46bf-afd0-ea659696c39e-kube-api-access-vxjgh" (OuterVolumeSpecName: "kube-api-access-vxjgh") pod "5a140421-46d2-46bf-afd0-ea659696c39e" (UID: "5a140421-46d2-46bf-afd0-ea659696c39e"). InnerVolumeSpecName "kube-api-access-vxjgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:04:48 crc kubenswrapper[4742]: I0317 12:04:48.316182 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a140421-46d2-46bf-afd0-ea659696c39e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a140421-46d2-46bf-afd0-ea659696c39e" (UID: "5a140421-46d2-46bf-afd0-ea659696c39e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:04:48 crc kubenswrapper[4742]: I0317 12:04:48.378226 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a140421-46d2-46bf-afd0-ea659696c39e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 12:04:48 crc kubenswrapper[4742]: I0317 12:04:48.378283 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxjgh\" (UniqueName: \"kubernetes.io/projected/5a140421-46d2-46bf-afd0-ea659696c39e-kube-api-access-vxjgh\") on node \"crc\" DevicePath \"\"" Mar 17 12:04:48 crc kubenswrapper[4742]: I0317 12:04:48.378306 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a140421-46d2-46bf-afd0-ea659696c39e-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 12:04:49 crc kubenswrapper[4742]: I0317 12:04:49.085777 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jdm8k" Mar 17 12:04:49 crc kubenswrapper[4742]: I0317 12:04:49.124639 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jdm8k"] Mar 17 12:04:49 crc kubenswrapper[4742]: I0317 12:04:49.133850 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jdm8k"] Mar 17 12:04:50 crc kubenswrapper[4742]: I0317 12:04:50.677268 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a140421-46d2-46bf-afd0-ea659696c39e" path="/var/lib/kubelet/pods/5a140421-46d2-46bf-afd0-ea659696c39e/volumes" Mar 17 12:04:52 crc kubenswrapper[4742]: I0317 12:04:52.663299 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:04:52 crc kubenswrapper[4742]: E0317 12:04:52.664047 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.227670 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2bfcd"] Mar 17 12:04:56 crc kubenswrapper[4742]: E0317 12:04:56.228788 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a140421-46d2-46bf-afd0-ea659696c39e" containerName="registry-server" Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.228802 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a140421-46d2-46bf-afd0-ea659696c39e" containerName="registry-server" Mar 17 12:04:56 crc kubenswrapper[4742]: E0317 12:04:56.228816 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a140421-46d2-46bf-afd0-ea659696c39e" containerName="extract-utilities" Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.228824 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a140421-46d2-46bf-afd0-ea659696c39e" containerName="extract-utilities" Mar 17 12:04:56 crc kubenswrapper[4742]: E0317 12:04:56.228837 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a140421-46d2-46bf-afd0-ea659696c39e" containerName="extract-content" Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.228843 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a140421-46d2-46bf-afd0-ea659696c39e" containerName="extract-content" Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.229079 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a140421-46d2-46bf-afd0-ea659696c39e" containerName="registry-server" Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.230420 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2bfcd" Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.252000 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2bfcd"] Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.344195 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4z9g\" (UniqueName: \"kubernetes.io/projected/0a2751f3-8411-42aa-af84-ebc5239c6c8d-kube-api-access-w4z9g\") pod \"community-operators-2bfcd\" (UID: \"0a2751f3-8411-42aa-af84-ebc5239c6c8d\") " pod="openshift-marketplace/community-operators-2bfcd" Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.344255 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a2751f3-8411-42aa-af84-ebc5239c6c8d-utilities\") pod \"community-operators-2bfcd\" (UID: \"0a2751f3-8411-42aa-af84-ebc5239c6c8d\") " pod="openshift-marketplace/community-operators-2bfcd" Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.344308 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a2751f3-8411-42aa-af84-ebc5239c6c8d-catalog-content\") pod \"community-operators-2bfcd\" (UID: \"0a2751f3-8411-42aa-af84-ebc5239c6c8d\") " pod="openshift-marketplace/community-operators-2bfcd" Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.446572 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4z9g\" (UniqueName: \"kubernetes.io/projected/0a2751f3-8411-42aa-af84-ebc5239c6c8d-kube-api-access-w4z9g\") pod \"community-operators-2bfcd\" (UID: \"0a2751f3-8411-42aa-af84-ebc5239c6c8d\") " pod="openshift-marketplace/community-operators-2bfcd" Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.446651 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a2751f3-8411-42aa-af84-ebc5239c6c8d-utilities\") pod \"community-operators-2bfcd\" (UID: \"0a2751f3-8411-42aa-af84-ebc5239c6c8d\") " pod="openshift-marketplace/community-operators-2bfcd" Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.446716 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a2751f3-8411-42aa-af84-ebc5239c6c8d-catalog-content\") pod \"community-operators-2bfcd\" (UID: \"0a2751f3-8411-42aa-af84-ebc5239c6c8d\") " pod="openshift-marketplace/community-operators-2bfcd" Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.447253 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a2751f3-8411-42aa-af84-ebc5239c6c8d-catalog-content\") pod \"community-operators-2bfcd\" (UID: \"0a2751f3-8411-42aa-af84-ebc5239c6c8d\") " pod="openshift-marketplace/community-operators-2bfcd" Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.447776 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a2751f3-8411-42aa-af84-ebc5239c6c8d-utilities\") pod \"community-operators-2bfcd\" (UID: \"0a2751f3-8411-42aa-af84-ebc5239c6c8d\") " pod="openshift-marketplace/community-operators-2bfcd" Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.464635 4742 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-w4z9g\" (UniqueName: \"kubernetes.io/projected/0a2751f3-8411-42aa-af84-ebc5239c6c8d-kube-api-access-w4z9g\") pod \"community-operators-2bfcd\" (UID: \"0a2751f3-8411-42aa-af84-ebc5239c6c8d\") " pod="openshift-marketplace/community-operators-2bfcd" Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.561563 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2bfcd" Mar 17 12:04:56 crc kubenswrapper[4742]: I0317 12:04:56.712880 4742 scope.go:117] "RemoveContainer" containerID="2e2a9b61ee805dab5cce2a2a8e7e4a8772f1b8d7c411a195cbc4ca953f251b23" Mar 17 12:04:57 crc kubenswrapper[4742]: I0317 12:04:57.159257 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2bfcd"] Mar 17 12:04:57 crc kubenswrapper[4742]: I0317 12:04:57.198545 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bfcd" event={"ID":"0a2751f3-8411-42aa-af84-ebc5239c6c8d","Type":"ContainerStarted","Data":"bfd5780e47bd634064f58feecd898f31ba269a28cee786d89d4082aadd495255"} Mar 17 12:04:58 crc kubenswrapper[4742]: I0317 12:04:58.211365 4742 generic.go:334] "Generic (PLEG): container finished" podID="0a2751f3-8411-42aa-af84-ebc5239c6c8d" containerID="35bd79335f79b2479261f2afc669ef4bbeee95cac481ea86781e170f1032a092" exitCode=0 Mar 17 12:04:58 crc kubenswrapper[4742]: I0317 12:04:58.211488 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bfcd" event={"ID":"0a2751f3-8411-42aa-af84-ebc5239c6c8d","Type":"ContainerDied","Data":"35bd79335f79b2479261f2afc669ef4bbeee95cac481ea86781e170f1032a092"} Mar 17 12:04:59 crc kubenswrapper[4742]: I0317 12:04:59.223553 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bfcd" event={"ID":"0a2751f3-8411-42aa-af84-ebc5239c6c8d","Type":"ContainerStarted","Data":"a17ef7bcdc3006e42849b0d52af5e04a6faa513fe3774da04883d90ad26a4c1e"} Mar 17 12:05:01 crc kubenswrapper[4742]: I0317 12:05:01.245895 4742 generic.go:334] "Generic (PLEG): container finished" podID="0a2751f3-8411-42aa-af84-ebc5239c6c8d" containerID="a17ef7bcdc3006e42849b0d52af5e04a6faa513fe3774da04883d90ad26a4c1e" exitCode=0 Mar 17 12:05:01 crc kubenswrapper[4742]: I0317 12:05:01.245975 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bfcd" event={"ID":"0a2751f3-8411-42aa-af84-ebc5239c6c8d","Type":"ContainerDied","Data":"a17ef7bcdc3006e42849b0d52af5e04a6faa513fe3774da04883d90ad26a4c1e"} Mar 17 12:05:02 crc kubenswrapper[4742]: I0317 12:05:02.274336 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bfcd" event={"ID":"0a2751f3-8411-42aa-af84-ebc5239c6c8d","Type":"ContainerStarted","Data":"5a5ffcddd43cd87aefc700ca4f89397b6800c2de9731d88d5f95223305009338"} Mar 17 12:05:02 crc kubenswrapper[4742]: I0317 12:05:02.306285 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2bfcd" podStartSLOduration=2.896789144 podStartE2EDuration="6.306265932s" podCreationTimestamp="2026-03-17 12:04:56 +0000 UTC" firstStartedPulling="2026-03-17 12:04:58.214643705 +0000 UTC m=+3201.340771463" lastFinishedPulling="2026-03-17 12:05:01.624120503 +0000 UTC m=+3204.750248251" observedRunningTime="2026-03-17 12:05:02.305319216 +0000 UTC m=+3205.431446984" 
watchObservedRunningTime="2026-03-17 12:05:02.306265932 +0000 UTC m=+3205.432393690" Mar 17 12:05:05 crc kubenswrapper[4742]: I0317 12:05:05.662424 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:05:05 crc kubenswrapper[4742]: E0317 12:05:05.663202 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:05:06 crc kubenswrapper[4742]: I0317 12:05:06.562118 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2bfcd" Mar 17 12:05:06 crc kubenswrapper[4742]: I0317 12:05:06.562523 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2bfcd" Mar 17 12:05:06 crc kubenswrapper[4742]: I0317 12:05:06.647769 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2bfcd" Mar 17 12:05:07 crc kubenswrapper[4742]: I0317 12:05:07.386253 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2bfcd" Mar 17 12:05:07 crc kubenswrapper[4742]: I0317 12:05:07.452498 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2bfcd"] Mar 17 12:05:09 crc kubenswrapper[4742]: I0317 12:05:09.353627 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2bfcd" podUID="0a2751f3-8411-42aa-af84-ebc5239c6c8d" containerName="registry-server" containerID="cri-o://5a5ffcddd43cd87aefc700ca4f89397b6800c2de9731d88d5f95223305009338" gracePeriod=2 Mar 17 12:05:09 crc kubenswrapper[4742]: I0317 12:05:09.819657 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2bfcd" Mar 17 12:05:09 crc kubenswrapper[4742]: I0317 12:05:09.906688 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4z9g\" (UniqueName: \"kubernetes.io/projected/0a2751f3-8411-42aa-af84-ebc5239c6c8d-kube-api-access-w4z9g\") pod \"0a2751f3-8411-42aa-af84-ebc5239c6c8d\" (UID: \"0a2751f3-8411-42aa-af84-ebc5239c6c8d\") " Mar 17 12:05:09 crc kubenswrapper[4742]: I0317 12:05:09.906809 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a2751f3-8411-42aa-af84-ebc5239c6c8d-catalog-content\") pod \"0a2751f3-8411-42aa-af84-ebc5239c6c8d\" (UID: \"0a2751f3-8411-42aa-af84-ebc5239c6c8d\") " Mar 17 12:05:09 crc kubenswrapper[4742]: I0317 12:05:09.906945 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a2751f3-8411-42aa-af84-ebc5239c6c8d-utilities\") pod \"0a2751f3-8411-42aa-af84-ebc5239c6c8d\" (UID: \"0a2751f3-8411-42aa-af84-ebc5239c6c8d\") " Mar 17 12:05:09 crc kubenswrapper[4742]: I0317 12:05:09.911141 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a2751f3-8411-42aa-af84-ebc5239c6c8d-utilities" (OuterVolumeSpecName: "utilities") pod "0a2751f3-8411-42aa-af84-ebc5239c6c8d" (UID: "0a2751f3-8411-42aa-af84-ebc5239c6c8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:05:09 crc kubenswrapper[4742]: I0317 12:05:09.913167 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a2751f3-8411-42aa-af84-ebc5239c6c8d-kube-api-access-w4z9g" (OuterVolumeSpecName: "kube-api-access-w4z9g") pod "0a2751f3-8411-42aa-af84-ebc5239c6c8d" (UID: "0a2751f3-8411-42aa-af84-ebc5239c6c8d"). InnerVolumeSpecName "kube-api-access-w4z9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:05:09 crc kubenswrapper[4742]: I0317 12:05:09.961780 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a2751f3-8411-42aa-af84-ebc5239c6c8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a2751f3-8411-42aa-af84-ebc5239c6c8d" (UID: "0a2751f3-8411-42aa-af84-ebc5239c6c8d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.008702 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a2751f3-8411-42aa-af84-ebc5239c6c8d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.008732 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a2751f3-8411-42aa-af84-ebc5239c6c8d-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.008747 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4z9g\" (UniqueName: \"kubernetes.io/projected/0a2751f3-8411-42aa-af84-ebc5239c6c8d-kube-api-access-w4z9g\") on node \"crc\" DevicePath \"\"" Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.366725 4742 generic.go:334] "Generic (PLEG): container finished" podID="0a2751f3-8411-42aa-af84-ebc5239c6c8d" containerID="5a5ffcddd43cd87aefc700ca4f89397b6800c2de9731d88d5f95223305009338" exitCode=0 Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.366784 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bfcd" event={"ID":"0a2751f3-8411-42aa-af84-ebc5239c6c8d","Type":"ContainerDied","Data":"5a5ffcddd43cd87aefc700ca4f89397b6800c2de9731d88d5f95223305009338"} Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.368999 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2bfcd" event={"ID":"0a2751f3-8411-42aa-af84-ebc5239c6c8d","Type":"ContainerDied","Data":"bfd5780e47bd634064f58feecd898f31ba269a28cee786d89d4082aadd495255"} Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.369034 4742 scope.go:117] "RemoveContainer" containerID="5a5ffcddd43cd87aefc700ca4f89397b6800c2de9731d88d5f95223305009338" Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.366864 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2bfcd" Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.394817 4742 scope.go:117] "RemoveContainer" containerID="a17ef7bcdc3006e42849b0d52af5e04a6faa513fe3774da04883d90ad26a4c1e" Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.424543 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2bfcd"] Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.428704 4742 scope.go:117] "RemoveContainer" containerID="35bd79335f79b2479261f2afc669ef4bbeee95cac481ea86781e170f1032a092" Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.430046 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2bfcd"] Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.486346 4742 scope.go:117] "RemoveContainer" containerID="5a5ffcddd43cd87aefc700ca4f89397b6800c2de9731d88d5f95223305009338" Mar 17 12:05:10 crc kubenswrapper[4742]: E0317 12:05:10.486692 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a5ffcddd43cd87aefc700ca4f89397b6800c2de9731d88d5f95223305009338\": container with ID starting with 5a5ffcddd43cd87aefc700ca4f89397b6800c2de9731d88d5f95223305009338 not found: ID does not exist" containerID="5a5ffcddd43cd87aefc700ca4f89397b6800c2de9731d88d5f95223305009338" Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.486744 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a5ffcddd43cd87aefc700ca4f89397b6800c2de9731d88d5f95223305009338"} err="failed to get container status \"5a5ffcddd43cd87aefc700ca4f89397b6800c2de9731d88d5f95223305009338\": rpc error: code = NotFound desc = could not find container \"5a5ffcddd43cd87aefc700ca4f89397b6800c2de9731d88d5f95223305009338\": container with ID starting with 5a5ffcddd43cd87aefc700ca4f89397b6800c2de9731d88d5f95223305009338 not found: ID does not exist" Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.486777 4742 scope.go:117] "RemoveContainer" containerID="a17ef7bcdc3006e42849b0d52af5e04a6faa513fe3774da04883d90ad26a4c1e" Mar 17 12:05:10 crc kubenswrapper[4742]: E0317 12:05:10.487699 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a17ef7bcdc3006e42849b0d52af5e04a6faa513fe3774da04883d90ad26a4c1e\": container with ID starting with a17ef7bcdc3006e42849b0d52af5e04a6faa513fe3774da04883d90ad26a4c1e not found: ID does not exist" containerID="a17ef7bcdc3006e42849b0d52af5e04a6faa513fe3774da04883d90ad26a4c1e" Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.487734 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a17ef7bcdc3006e42849b0d52af5e04a6faa513fe3774da04883d90ad26a4c1e"} err="failed to get container status \"a17ef7bcdc3006e42849b0d52af5e04a6faa513fe3774da04883d90ad26a4c1e\": rpc error: code = NotFound desc = could not find container \"a17ef7bcdc3006e42849b0d52af5e04a6faa513fe3774da04883d90ad26a4c1e\": container with ID starting with a17ef7bcdc3006e42849b0d52af5e04a6faa513fe3774da04883d90ad26a4c1e not found: ID does not exist" Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.487754 4742 scope.go:117] "RemoveContainer" containerID="35bd79335f79b2479261f2afc669ef4bbeee95cac481ea86781e170f1032a092" Mar 17 12:05:10 crc kubenswrapper[4742]: E0317 12:05:10.488122 4742 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"35bd79335f79b2479261f2afc669ef4bbeee95cac481ea86781e170f1032a092\": container with ID starting with 35bd79335f79b2479261f2afc669ef4bbeee95cac481ea86781e170f1032a092 not found: ID does not exist" containerID="35bd79335f79b2479261f2afc669ef4bbeee95cac481ea86781e170f1032a092" Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.488157 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35bd79335f79b2479261f2afc669ef4bbeee95cac481ea86781e170f1032a092"} err="failed to get container status \"35bd79335f79b2479261f2afc669ef4bbeee95cac481ea86781e170f1032a092\": rpc error: code = NotFound desc = could not find container \"35bd79335f79b2479261f2afc669ef4bbeee95cac481ea86781e170f1032a092\": container with ID starting with 35bd79335f79b2479261f2afc669ef4bbeee95cac481ea86781e170f1032a092 not found: ID does not exist" Mar 17 12:05:10 crc kubenswrapper[4742]: I0317 12:05:10.673836 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a2751f3-8411-42aa-af84-ebc5239c6c8d" path="/var/lib/kubelet/pods/0a2751f3-8411-42aa-af84-ebc5239c6c8d/volumes" Mar 17 12:05:20 crc kubenswrapper[4742]: I0317 12:05:20.663156 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:05:20 crc kubenswrapper[4742]: E0317 12:05:20.664526 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:05:33 crc kubenswrapper[4742]: I0317 12:05:33.663508 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:05:33 crc kubenswrapper[4742]: E0317 12:05:33.664419 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:05:44 crc kubenswrapper[4742]: I0317 12:05:44.663406 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:05:44 crc kubenswrapper[4742]: E0317 12:05:44.664671 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:05:59 crc kubenswrapper[4742]: I0317 12:05:59.662866 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:05:59 crc kubenswrapper[4742]: I0317 12:05:59.977231 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"5629d3aafca0c84011d832a91e511d2d987d5f8c9f7e5232caeaae344d4b0a53"} Mar 17 12:06:00 crc kubenswrapper[4742]: I0317 12:06:00.171641 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562486-hrrqx"] Mar 17 12:06:00 crc kubenswrapper[4742]: E0317 12:06:00.172254 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2751f3-8411-42aa-af84-ebc5239c6c8d" containerName="extract-content" Mar 17 12:06:00 crc kubenswrapper[4742]: I0317 12:06:00.172284 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2751f3-8411-42aa-af84-ebc5239c6c8d" containerName="extract-content" Mar 17 12:06:00 crc kubenswrapper[4742]: E0317 12:06:00.172328 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2751f3-8411-42aa-af84-ebc5239c6c8d" containerName="registry-server" Mar 17 12:06:00 crc kubenswrapper[4742]: I0317 12:06:00.172341 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2751f3-8411-42aa-af84-ebc5239c6c8d" containerName="registry-server" Mar 17 12:06:00 crc kubenswrapper[4742]: E0317 12:06:00.172353 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2751f3-8411-42aa-af84-ebc5239c6c8d" containerName="extract-utilities" Mar 17 12:06:00 crc kubenswrapper[4742]: I0317 12:06:00.172364 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2751f3-8411-42aa-af84-ebc5239c6c8d" containerName="extract-utilities" Mar 17 12:06:00 crc kubenswrapper[4742]: I0317 12:06:00.172710 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a2751f3-8411-42aa-af84-ebc5239c6c8d" containerName="registry-server" Mar 17 12:06:00 crc kubenswrapper[4742]: I0317 12:06:00.173743 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562486-hrrqx" Mar 17 12:06:00 crc kubenswrapper[4742]: I0317 12:06:00.176614 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 12:06:00 crc kubenswrapper[4742]: I0317 12:06:00.177863 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 12:06:00 crc kubenswrapper[4742]: I0317 12:06:00.178125 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 12:06:00 crc kubenswrapper[4742]: I0317 12:06:00.180365 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562486-hrrqx"] Mar 17 12:06:00 crc kubenswrapper[4742]: I0317 12:06:00.278781 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk94r\" (UniqueName: \"kubernetes.io/projected/a5ab30c1-6a19-43e3-b6c1-fd5003e27c33-kube-api-access-bk94r\") pod \"auto-csr-approver-29562486-hrrqx\" (UID: \"a5ab30c1-6a19-43e3-b6c1-fd5003e27c33\") " pod="openshift-infra/auto-csr-approver-29562486-hrrqx" Mar 17 12:06:00 crc kubenswrapper[4742]: I0317 12:06:00.380878 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk94r\" (UniqueName: \"kubernetes.io/projected/a5ab30c1-6a19-43e3-b6c1-fd5003e27c33-kube-api-access-bk94r\") pod \"auto-csr-approver-29562486-hrrqx\" (UID: \"a5ab30c1-6a19-43e3-b6c1-fd5003e27c33\") " pod="openshift-infra/auto-csr-approver-29562486-hrrqx" Mar 17 12:06:00 crc kubenswrapper[4742]: I0317 12:06:00.403362 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk94r\" (UniqueName: \"kubernetes.io/projected/a5ab30c1-6a19-43e3-b6c1-fd5003e27c33-kube-api-access-bk94r\") pod \"auto-csr-approver-29562486-hrrqx\" (UID: \"a5ab30c1-6a19-43e3-b6c1-fd5003e27c33\") " pod="openshift-infra/auto-csr-approver-29562486-hrrqx" Mar 17 12:06:00 crc kubenswrapper[4742]: I0317 12:06:00.494848 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562486-hrrqx" Mar 17 12:06:01 crc kubenswrapper[4742]: I0317 12:06:01.018476 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562486-hrrqx"] Mar 17 12:06:01 crc kubenswrapper[4742]: W0317 12:06:01.021106 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5ab30c1_6a19_43e3_b6c1_fd5003e27c33.slice/crio-d13c706dbc6d7cc29f0f38e5b8666519e45e15efb8429136e5e7107a5336276c WatchSource:0}: Error finding container d13c706dbc6d7cc29f0f38e5b8666519e45e15efb8429136e5e7107a5336276c: Status 404 returned error can't find the container with id d13c706dbc6d7cc29f0f38e5b8666519e45e15efb8429136e5e7107a5336276c Mar 17 12:06:02 crc kubenswrapper[4742]: I0317 12:06:02.003291 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562486-hrrqx" event={"ID":"a5ab30c1-6a19-43e3-b6c1-fd5003e27c33","Type":"ContainerStarted","Data":"d13c706dbc6d7cc29f0f38e5b8666519e45e15efb8429136e5e7107a5336276c"} Mar 17 12:06:03 crc kubenswrapper[4742]: I0317 12:06:03.015556 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562486-hrrqx" event={"ID":"a5ab30c1-6a19-43e3-b6c1-fd5003e27c33","Type":"ContainerStarted","Data":"927b24b4bc5a0eba7616b925f65e5e5173963560106466454d3fdf5e05542eb0"} Mar 17 12:06:03 crc kubenswrapper[4742]: I0317 12:06:03.034346 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562486-hrrqx" podStartSLOduration=1.548961655 podStartE2EDuration="3.034307703s" podCreationTimestamp="2026-03-17 12:06:00 +0000 UTC" firstStartedPulling="2026-03-17 12:06:01.023215169 +0000 UTC m=+3264.149342967" lastFinishedPulling="2026-03-17 12:06:02.508561237 +0000 UTC m=+3265.634689015" observedRunningTime="2026-03-17 12:06:03.027653072 +0000 UTC m=+3266.153780870" watchObservedRunningTime="2026-03-17 12:06:03.034307703 +0000 UTC m=+3266.160435501" Mar 17 12:06:04 crc kubenswrapper[4742]: I0317 12:06:04.033828 4742 generic.go:334] "Generic (PLEG): container finished" podID="a5ab30c1-6a19-43e3-b6c1-fd5003e27c33" containerID="927b24b4bc5a0eba7616b925f65e5e5173963560106466454d3fdf5e05542eb0" exitCode=0 Mar 17 12:06:04 crc kubenswrapper[4742]: I0317 12:06:04.033977 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562486-hrrqx" event={"ID":"a5ab30c1-6a19-43e3-b6c1-fd5003e27c33","Type":"ContainerDied","Data":"927b24b4bc5a0eba7616b925f65e5e5173963560106466454d3fdf5e05542eb0"} Mar 17 12:06:05 crc kubenswrapper[4742]: I0317 12:06:05.535774 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562486-hrrqx" Mar 17 12:06:05 crc kubenswrapper[4742]: I0317 12:06:05.684236 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk94r\" (UniqueName: \"kubernetes.io/projected/a5ab30c1-6a19-43e3-b6c1-fd5003e27c33-kube-api-access-bk94r\") pod \"a5ab30c1-6a19-43e3-b6c1-fd5003e27c33\" (UID: \"a5ab30c1-6a19-43e3-b6c1-fd5003e27c33\") " Mar 17 12:06:05 crc kubenswrapper[4742]: I0317 12:06:05.692953 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ab30c1-6a19-43e3-b6c1-fd5003e27c33-kube-api-access-bk94r" (OuterVolumeSpecName: "kube-api-access-bk94r") pod "a5ab30c1-6a19-43e3-b6c1-fd5003e27c33" (UID: "a5ab30c1-6a19-43e3-b6c1-fd5003e27c33"). InnerVolumeSpecName "kube-api-access-bk94r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:06:05 crc kubenswrapper[4742]: I0317 12:06:05.787339 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk94r\" (UniqueName: \"kubernetes.io/projected/a5ab30c1-6a19-43e3-b6c1-fd5003e27c33-kube-api-access-bk94r\") on node \"crc\" DevicePath \"\"" Mar 17 12:06:06 crc kubenswrapper[4742]: I0317 12:06:06.061250 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562486-hrrqx" event={"ID":"a5ab30c1-6a19-43e3-b6c1-fd5003e27c33","Type":"ContainerDied","Data":"d13c706dbc6d7cc29f0f38e5b8666519e45e15efb8429136e5e7107a5336276c"} Mar 17 12:06:06 crc kubenswrapper[4742]: I0317 12:06:06.061709 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d13c706dbc6d7cc29f0f38e5b8666519e45e15efb8429136e5e7107a5336276c" Mar 17 12:06:06 crc kubenswrapper[4742]: I0317 12:06:06.061348 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562486-hrrqx" Mar 17 12:06:06 crc kubenswrapper[4742]: I0317 12:06:06.138052 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562480-8l4gb"] Mar 17 12:06:06 crc kubenswrapper[4742]: I0317 12:06:06.150466 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562480-8l4gb"] Mar 17 12:06:06 crc kubenswrapper[4742]: I0317 12:06:06.680202 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff23f76e-4787-4741-a13b-ada68fec94b9" path="/var/lib/kubelet/pods/ff23f76e-4787-4741-a13b-ada68fec94b9/volumes" Mar 17 12:06:56 crc kubenswrapper[4742]: I0317 12:06:56.918733 4742 scope.go:117] "RemoveContainer" containerID="5411daf7000da77b741d2d85d3477e8df997cb414d8d420141d57f72fb9da9fe" Mar 17 12:08:00 crc kubenswrapper[4742]: I0317 12:08:00.174567 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562488-d2k9r"] Mar 17 12:08:00 crc kubenswrapper[4742]: E0317 12:08:00.175668 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ab30c1-6a19-43e3-b6c1-fd5003e27c33" containerName="oc" Mar 17 12:08:00 crc kubenswrapper[4742]: I0317 12:08:00.175684 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ab30c1-6a19-43e3-b6c1-fd5003e27c33" containerName="oc" Mar 17 12:08:00 crc kubenswrapper[4742]: I0317 12:08:00.175988 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ab30c1-6a19-43e3-b6c1-fd5003e27c33" containerName="oc" Mar 17 12:08:00 crc kubenswrapper[4742]: I0317 12:08:00.176739 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562488-d2k9r" Mar 17 12:08:00 crc kubenswrapper[4742]: I0317 12:08:00.182181 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 12:08:00 crc kubenswrapper[4742]: I0317 12:08:00.182473 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 12:08:00 crc kubenswrapper[4742]: I0317 12:08:00.182704 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 12:08:00 crc kubenswrapper[4742]: I0317 12:08:00.199257 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562488-d2k9r"] Mar 17 12:08:00 crc kubenswrapper[4742]: I0317 12:08:00.283387 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6pw8\" (UniqueName: \"kubernetes.io/projected/942e9b43-8c54-4c69-8a59-f8315ce878b8-kube-api-access-c6pw8\") pod \"auto-csr-approver-29562488-d2k9r\" (UID: \"942e9b43-8c54-4c69-8a59-f8315ce878b8\") " pod="openshift-infra/auto-csr-approver-29562488-d2k9r" Mar 17 12:08:00 crc kubenswrapper[4742]: I0317 12:08:00.385264 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6pw8\" (UniqueName: \"kubernetes.io/projected/942e9b43-8c54-4c69-8a59-f8315ce878b8-kube-api-access-c6pw8\") pod \"auto-csr-approver-29562488-d2k9r\" (UID: \"942e9b43-8c54-4c69-8a59-f8315ce878b8\") " pod="openshift-infra/auto-csr-approver-29562488-d2k9r" Mar 17 12:08:00 crc kubenswrapper[4742]: I0317 12:08:00.421614 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6pw8\" (UniqueName: 
\"kubernetes.io/projected/942e9b43-8c54-4c69-8a59-f8315ce878b8-kube-api-access-c6pw8\") pod \"auto-csr-approver-29562488-d2k9r\" (UID: \"942e9b43-8c54-4c69-8a59-f8315ce878b8\") " pod="openshift-infra/auto-csr-approver-29562488-d2k9r" Mar 17 12:08:00 crc kubenswrapper[4742]: I0317 12:08:00.523219 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562488-d2k9r" Mar 17 12:08:01 crc kubenswrapper[4742]: I0317 12:08:01.008562 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562488-d2k9r"] Mar 17 12:08:01 crc kubenswrapper[4742]: W0317 12:08:01.009490 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod942e9b43_8c54_4c69_8a59_f8315ce878b8.slice/crio-9f7e60261d02e2a488d75ba0a351bb8f99028cc6dd1b67ae073c2970c7ff5643 WatchSource:0}: Error finding container 9f7e60261d02e2a488d75ba0a351bb8f99028cc6dd1b67ae073c2970c7ff5643: Status 404 returned error can't find the container with id 9f7e60261d02e2a488d75ba0a351bb8f99028cc6dd1b67ae073c2970c7ff5643 Mar 17 12:08:01 crc kubenswrapper[4742]: I0317 12:08:01.412693 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562488-d2k9r" event={"ID":"942e9b43-8c54-4c69-8a59-f8315ce878b8","Type":"ContainerStarted","Data":"9f7e60261d02e2a488d75ba0a351bb8f99028cc6dd1b67ae073c2970c7ff5643"} Mar 17 12:08:03 crc kubenswrapper[4742]: I0317 12:08:03.462949 4742 generic.go:334] "Generic (PLEG): container finished" podID="942e9b43-8c54-4c69-8a59-f8315ce878b8" containerID="e43e575827c29d46c43fa664bb9238d3a111f9cd25c6323609e0a2f0d2bdb306" exitCode=0 Mar 17 12:08:03 crc kubenswrapper[4742]: I0317 12:08:03.463114 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562488-d2k9r" event={"ID":"942e9b43-8c54-4c69-8a59-f8315ce878b8","Type":"ContainerDied","Data":"e43e575827c29d46c43fa664bb9238d3a111f9cd25c6323609e0a2f0d2bdb306"} Mar 17 12:08:04 crc kubenswrapper[4742]: I0317 12:08:04.932182 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562488-d2k9r" Mar 17 12:08:05 crc kubenswrapper[4742]: I0317 12:08:05.096018 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6pw8\" (UniqueName: \"kubernetes.io/projected/942e9b43-8c54-4c69-8a59-f8315ce878b8-kube-api-access-c6pw8\") pod \"942e9b43-8c54-4c69-8a59-f8315ce878b8\" (UID: \"942e9b43-8c54-4c69-8a59-f8315ce878b8\") " Mar 17 12:08:05 crc kubenswrapper[4742]: I0317 12:08:05.120875 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/942e9b43-8c54-4c69-8a59-f8315ce878b8-kube-api-access-c6pw8" (OuterVolumeSpecName: "kube-api-access-c6pw8") pod "942e9b43-8c54-4c69-8a59-f8315ce878b8" (UID: "942e9b43-8c54-4c69-8a59-f8315ce878b8"). InnerVolumeSpecName "kube-api-access-c6pw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:08:05 crc kubenswrapper[4742]: I0317 12:08:05.198873 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6pw8\" (UniqueName: \"kubernetes.io/projected/942e9b43-8c54-4c69-8a59-f8315ce878b8-kube-api-access-c6pw8\") on node \"crc\" DevicePath \"\"" Mar 17 12:08:05 crc kubenswrapper[4742]: I0317 12:08:05.491532 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562488-d2k9r" event={"ID":"942e9b43-8c54-4c69-8a59-f8315ce878b8","Type":"ContainerDied","Data":"9f7e60261d02e2a488d75ba0a351bb8f99028cc6dd1b67ae073c2970c7ff5643"} Mar 17 12:08:05 crc kubenswrapper[4742]: I0317 12:08:05.491596 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f7e60261d02e2a488d75ba0a351bb8f99028cc6dd1b67ae073c2970c7ff5643" Mar 17 12:08:05 crc kubenswrapper[4742]: I0317 12:08:05.491683 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562488-d2k9r" Mar 17 12:08:06 crc kubenswrapper[4742]: I0317 12:08:06.000221 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562482-vpxzt"] Mar 17 12:08:06 crc kubenswrapper[4742]: I0317 12:08:06.008531 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562482-vpxzt"] Mar 17 12:08:06 crc kubenswrapper[4742]: I0317 12:08:06.678524 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="233a66be-bd9a-4210-9832-16f145d41f0a" path="/var/lib/kubelet/pods/233a66be-bd9a-4210-9832-16f145d41f0a/volumes" Mar 17 12:08:18 crc kubenswrapper[4742]: I0317 12:08:18.044472 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 12:08:18 crc kubenswrapper[4742]: I0317 12:08:18.045079 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 12:08:48 crc kubenswrapper[4742]: I0317 12:08:48.044037 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 12:08:48 crc kubenswrapper[4742]: I0317 12:08:48.044675 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 12:08:57 crc kubenswrapper[4742]: I0317 12:08:57.045638 4742 scope.go:117] "RemoveContainer" containerID="f1a67ed09ec19b9d95269a3b2d125bd7572d3c6f3b8c738e20ad27ae053ade71" Mar 17 12:09:18 crc kubenswrapper[4742]: I0317 12:09:18.044205 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 12:09:18 crc kubenswrapper[4742]: I0317 12:09:18.044840 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 12:09:18 crc kubenswrapper[4742]: I0317 12:09:18.044924 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 12:09:18 crc kubenswrapper[4742]: I0317 12:09:18.045623 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5629d3aafca0c84011d832a91e511d2d987d5f8c9f7e5232caeaae344d4b0a53"} pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 12:09:18 crc kubenswrapper[4742]: I0317 12:09:18.045691 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" containerID="cri-o://5629d3aafca0c84011d832a91e511d2d987d5f8c9f7e5232caeaae344d4b0a53" gracePeriod=600 Mar 17 12:09:18 crc kubenswrapper[4742]: I0317 12:09:18.325820 4742 generic.go:334] "Generic (PLEG): container finished" podID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerID="5629d3aafca0c84011d832a91e511d2d987d5f8c9f7e5232caeaae344d4b0a53" exitCode=0 Mar 17 12:09:18 crc kubenswrapper[4742]: I0317 12:09:18.325866 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerDied","Data":"5629d3aafca0c84011d832a91e511d2d987d5f8c9f7e5232caeaae344d4b0a53"} Mar 17 12:09:18 crc kubenswrapper[4742]: I0317 12:09:18.325898 4742 scope.go:117] "RemoveContainer" containerID="291eed816b57dd4ac314ae5810b28ba6bec67bac2355850a5cd2d43fb301ce50" Mar 17 12:09:19 crc kubenswrapper[4742]: I0317 12:09:19.340614 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f"} Mar 17 12:10:00 crc kubenswrapper[4742]: I0317 12:10:00.152728 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562490-qm8k9"] Mar 17 12:10:00 crc kubenswrapper[4742]: E0317 12:10:00.153715 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="942e9b43-8c54-4c69-8a59-f8315ce878b8" containerName="oc" Mar 17 12:10:00 crc kubenswrapper[4742]: I0317 12:10:00.153733 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="942e9b43-8c54-4c69-8a59-f8315ce878b8" containerName="oc" Mar 17 12:10:00 crc kubenswrapper[4742]: I0317 12:10:00.154030 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="942e9b43-8c54-4c69-8a59-f8315ce878b8" containerName="oc" Mar 17 12:10:00 crc kubenswrapper[4742]: I0317 12:10:00.154733 4742 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562490-qm8k9" Mar 17 12:10:00 crc kubenswrapper[4742]: I0317 12:10:00.157580 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 12:10:00 crc kubenswrapper[4742]: I0317 12:10:00.158208 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 12:10:00 crc kubenswrapper[4742]: I0317 12:10:00.159835 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 12:10:00 crc kubenswrapper[4742]: I0317 12:10:00.165197 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562490-qm8k9"] Mar 17 12:10:00 crc kubenswrapper[4742]: I0317 12:10:00.189767 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2rf9\" (UniqueName: \"kubernetes.io/projected/2a1431b3-c664-40ff-971b-d336e91cc3c8-kube-api-access-w2rf9\") pod \"auto-csr-approver-29562490-qm8k9\" (UID: \"2a1431b3-c664-40ff-971b-d336e91cc3c8\") " pod="openshift-infra/auto-csr-approver-29562490-qm8k9" Mar 17 12:10:00 crc kubenswrapper[4742]: I0317 12:10:00.291254 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2rf9\" (UniqueName: \"kubernetes.io/projected/2a1431b3-c664-40ff-971b-d336e91cc3c8-kube-api-access-w2rf9\") pod \"auto-csr-approver-29562490-qm8k9\" (UID: \"2a1431b3-c664-40ff-971b-d336e91cc3c8\") " pod="openshift-infra/auto-csr-approver-29562490-qm8k9" Mar 17 12:10:00 crc kubenswrapper[4742]: I0317 12:10:00.314809 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2rf9\" (UniqueName: \"kubernetes.io/projected/2a1431b3-c664-40ff-971b-d336e91cc3c8-kube-api-access-w2rf9\") pod \"auto-csr-approver-29562490-qm8k9\" (UID: \"2a1431b3-c664-40ff-971b-d336e91cc3c8\") " pod="openshift-infra/auto-csr-approver-29562490-qm8k9" Mar 17 12:10:00 crc kubenswrapper[4742]: I0317 12:10:00.492872 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562490-qm8k9" Mar 17 12:10:01 crc kubenswrapper[4742]: I0317 12:10:01.008110 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562490-qm8k9"] Mar 17 12:10:01 crc kubenswrapper[4742]: I0317 12:10:01.017893 4742 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 12:10:01 crc kubenswrapper[4742]: I0317 12:10:01.859875 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562490-qm8k9" event={"ID":"2a1431b3-c664-40ff-971b-d336e91cc3c8","Type":"ContainerStarted","Data":"b6bdf22a5fc716e2515d3c55387375bee4aad16df57dc18eedbbbd68b84e2f52"} Mar 17 12:10:02 crc kubenswrapper[4742]: I0317 12:10:02.873610 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562490-qm8k9" event={"ID":"2a1431b3-c664-40ff-971b-d336e91cc3c8","Type":"ContainerStarted","Data":"8bc4783cbc6cf0e40f8359efe811d99c58b3e56f7b6aaff9ff4c505675166502"} Mar 17 12:10:02 crc kubenswrapper[4742]: I0317 12:10:02.899357 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562490-qm8k9" podStartSLOduration=1.527100707 podStartE2EDuration="2.89933121s" podCreationTimestamp="2026-03-17 12:10:00 +0000 UTC" firstStartedPulling="2026-03-17 12:10:01.01765831 +0000 UTC m=+3504.143786068" lastFinishedPulling="2026-03-17 12:10:02.389888813 +0000 UTC m=+3505.516016571" observedRunningTime="2026-03-17 12:10:02.894891709 +0000 UTC m=+3506.021019477" watchObservedRunningTime="2026-03-17 12:10:02.89933121 +0000 UTC m=+3506.025458988" Mar 17 12:10:03 crc kubenswrapper[4742]: I0317 12:10:03.885902 4742 generic.go:334] "Generic (PLEG): container finished" podID="2a1431b3-c664-40ff-971b-d336e91cc3c8" containerID="8bc4783cbc6cf0e40f8359efe811d99c58b3e56f7b6aaff9ff4c505675166502" exitCode=0 Mar 17 12:10:03 crc kubenswrapper[4742]: I0317 12:10:03.886035 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562490-qm8k9" event={"ID":"2a1431b3-c664-40ff-971b-d336e91cc3c8","Type":"ContainerDied","Data":"8bc4783cbc6cf0e40f8359efe811d99c58b3e56f7b6aaff9ff4c505675166502"} Mar 17 12:10:05 crc kubenswrapper[4742]: I0317 12:10:05.258235 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562490-qm8k9" Mar 17 12:10:05 crc kubenswrapper[4742]: I0317 12:10:05.415146 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2rf9\" (UniqueName: \"kubernetes.io/projected/2a1431b3-c664-40ff-971b-d336e91cc3c8-kube-api-access-w2rf9\") pod \"2a1431b3-c664-40ff-971b-d336e91cc3c8\" (UID: \"2a1431b3-c664-40ff-971b-d336e91cc3c8\") " Mar 17 12:10:05 crc kubenswrapper[4742]: I0317 12:10:05.423225 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1431b3-c664-40ff-971b-d336e91cc3c8-kube-api-access-w2rf9" (OuterVolumeSpecName: "kube-api-access-w2rf9") pod "2a1431b3-c664-40ff-971b-d336e91cc3c8" (UID: "2a1431b3-c664-40ff-971b-d336e91cc3c8"). InnerVolumeSpecName "kube-api-access-w2rf9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:10:05 crc kubenswrapper[4742]: I0317 12:10:05.517514 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2rf9\" (UniqueName: \"kubernetes.io/projected/2a1431b3-c664-40ff-971b-d336e91cc3c8-kube-api-access-w2rf9\") on node \"crc\" DevicePath \"\"" Mar 17 12:10:05 crc kubenswrapper[4742]: I0317 12:10:05.914772 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562490-qm8k9" event={"ID":"2a1431b3-c664-40ff-971b-d336e91cc3c8","Type":"ContainerDied","Data":"b6bdf22a5fc716e2515d3c55387375bee4aad16df57dc18eedbbbd68b84e2f52"} Mar 17 12:10:05 crc kubenswrapper[4742]: I0317 12:10:05.914825 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6bdf22a5fc716e2515d3c55387375bee4aad16df57dc18eedbbbd68b84e2f52" Mar 17 12:10:05 crc kubenswrapper[4742]: I0317 12:10:05.914873 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562490-qm8k9" Mar 17 12:10:05 crc kubenswrapper[4742]: I0317 12:10:05.970563 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562484-w74rx"] Mar 17 12:10:05 crc kubenswrapper[4742]: I0317 12:10:05.983053 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562484-w74rx"] Mar 17 12:10:06 crc kubenswrapper[4742]: I0317 12:10:06.674584 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0cc7105-8840-46be-9cf2-a52080296716" path="/var/lib/kubelet/pods/d0cc7105-8840-46be-9cf2-a52080296716/volumes" Mar 17 12:10:27 crc kubenswrapper[4742]: I0317 12:10:27.256456 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c2hjn"] Mar 17 12:10:27 crc kubenswrapper[4742]: E0317 12:10:27.258298 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1431b3-c664-40ff-971b-d336e91cc3c8" containerName="oc" Mar 17 12:10:27 crc kubenswrapper[4742]: I0317 12:10:27.258336 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1431b3-c664-40ff-971b-d336e91cc3c8" containerName="oc" Mar 17 12:10:27 crc kubenswrapper[4742]: I0317 12:10:27.258870 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1431b3-c664-40ff-971b-d336e91cc3c8" containerName="oc" Mar 17 12:10:27 crc kubenswrapper[4742]: I0317 12:10:27.261812 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c2hjn" Mar 17 12:10:27 crc kubenswrapper[4742]: I0317 12:10:27.286763 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c2hjn"] Mar 17 12:10:27 crc kubenswrapper[4742]: I0317 12:10:27.399559 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-catalog-content\") pod \"redhat-operators-c2hjn\" (UID: \"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8\") " pod="openshift-marketplace/redhat-operators-c2hjn" Mar 17 12:10:27 crc kubenswrapper[4742]: I0317 12:10:27.399640 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-utilities\") pod \"redhat-operators-c2hjn\" (UID: \"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8\") " pod="openshift-marketplace/redhat-operators-c2hjn" Mar 17 12:10:27 crc kubenswrapper[4742]: I0317 12:10:27.399733 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt7rk\" (UniqueName: \"kubernetes.io/projected/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-kube-api-access-kt7rk\") pod \"redhat-operators-c2hjn\" (UID: \"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8\") " pod="openshift-marketplace/redhat-operators-c2hjn" Mar 17 12:10:27 crc kubenswrapper[4742]: I0317 12:10:27.501059 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt7rk\" (UniqueName: \"kubernetes.io/projected/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-kube-api-access-kt7rk\") pod \"redhat-operators-c2hjn\" (UID: \"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8\") " pod="openshift-marketplace/redhat-operators-c2hjn" Mar 17 12:10:27 crc kubenswrapper[4742]: I0317 12:10:27.501172 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-catalog-content\") pod \"redhat-operators-c2hjn\" (UID: \"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8\") " pod="openshift-marketplace/redhat-operators-c2hjn" Mar 17 12:10:27 crc kubenswrapper[4742]: I0317 12:10:27.501236 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-utilities\") pod \"redhat-operators-c2hjn\" (UID: \"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8\") " pod="openshift-marketplace/redhat-operators-c2hjn" Mar 17 12:10:27 crc kubenswrapper[4742]: I0317 12:10:27.501804 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-utilities\") pod \"redhat-operators-c2hjn\" (UID: \"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8\") " pod="openshift-marketplace/redhat-operators-c2hjn" Mar 17 12:10:27 crc kubenswrapper[4742]: I0317 12:10:27.501976 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-catalog-content\") pod \"redhat-operators-c2hjn\" (UID: \"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8\") " pod="openshift-marketplace/redhat-operators-c2hjn" Mar 17 12:10:27 crc kubenswrapper[4742]: I0317 12:10:27.526984 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kt7rk\" (UniqueName: \"kubernetes.io/projected/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-kube-api-access-kt7rk\") pod \"redhat-operators-c2hjn\" (UID: \"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8\") " pod="openshift-marketplace/redhat-operators-c2hjn" Mar 17 12:10:27 crc kubenswrapper[4742]: I0317 12:10:27.629294 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c2hjn" Mar 17 12:10:28 crc kubenswrapper[4742]: W0317 12:10:28.195745 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b0ae19f_e24c_45d1_8b4b_4469b97ceba8.slice/crio-a504bd3cba19a73c83084359e6a1b442d56ce1135f4000d4cf00d9c573bcb21e WatchSource:0}: Error finding container a504bd3cba19a73c83084359e6a1b442d56ce1135f4000d4cf00d9c573bcb21e: Status 404 returned error can't find the container with id a504bd3cba19a73c83084359e6a1b442d56ce1135f4000d4cf00d9c573bcb21e Mar 17 12:10:28 crc kubenswrapper[4742]: I0317 12:10:28.198339 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c2hjn"] Mar 17 12:10:29 crc kubenswrapper[4742]: I0317 12:10:29.138168 4742 generic.go:334] "Generic (PLEG): container finished" podID="5b0ae19f-e24c-45d1-8b4b-4469b97ceba8" containerID="4737db3194af7312585fa794fafbc02e1eb960e5f902c79ede14cb3f45cdf799" exitCode=0 Mar 17 12:10:29 crc kubenswrapper[4742]: I0317 12:10:29.138521 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2hjn" event={"ID":"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8","Type":"ContainerDied","Data":"4737db3194af7312585fa794fafbc02e1eb960e5f902c79ede14cb3f45cdf799"} Mar 17 12:10:29 crc kubenswrapper[4742]: I0317 12:10:29.138551 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2hjn" event={"ID":"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8","Type":"ContainerStarted","Data":"a504bd3cba19a73c83084359e6a1b442d56ce1135f4000d4cf00d9c573bcb21e"} Mar 17 12:10:31 crc kubenswrapper[4742]: I0317 12:10:31.161362 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2hjn" event={"ID":"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8","Type":"ContainerStarted","Data":"848b265a80a2c8d9724b63f7b970a96f5ce1d741c3d733df6cef0f93093b2b0f"} Mar 17 12:10:33 crc kubenswrapper[4742]: I0317 12:10:33.191821 4742 generic.go:334] "Generic (PLEG): container finished" podID="5b0ae19f-e24c-45d1-8b4b-4469b97ceba8" containerID="848b265a80a2c8d9724b63f7b970a96f5ce1d741c3d733df6cef0f93093b2b0f" exitCode=0 Mar 17 12:10:33 crc kubenswrapper[4742]: I0317 12:10:33.191957 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2hjn" event={"ID":"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8","Type":"ContainerDied","Data":"848b265a80a2c8d9724b63f7b970a96f5ce1d741c3d733df6cef0f93093b2b0f"} Mar 17 12:10:34 crc kubenswrapper[4742]: I0317 12:10:34.209301 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2hjn" event={"ID":"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8","Type":"ContainerStarted","Data":"67a859d56da81f0a50436d68214a89abb7913bac6429ae52e8c3a51d2bebbac7"} Mar 17 12:10:34 crc kubenswrapper[4742]: I0317 12:10:34.231023 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c2hjn" podStartSLOduration=2.725168531 podStartE2EDuration="7.230998051s" 
podCreationTimestamp="2026-03-17 12:10:27 +0000 UTC" firstStartedPulling="2026-03-17 12:10:29.139751983 +0000 UTC m=+3532.265879751" lastFinishedPulling="2026-03-17 12:10:33.645581513 +0000 UTC m=+3536.771709271" observedRunningTime="2026-03-17 12:10:34.225074331 +0000 UTC m=+3537.351202099" watchObservedRunningTime="2026-03-17 12:10:34.230998051 +0000 UTC m=+3537.357125809" Mar 17 12:10:37 crc kubenswrapper[4742]: I0317 12:10:37.629766 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c2hjn" Mar 17 12:10:37 crc kubenswrapper[4742]: I0317 12:10:37.630345 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c2hjn" Mar 17 12:10:38 crc kubenswrapper[4742]: I0317 12:10:38.684163 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c2hjn" podUID="5b0ae19f-e24c-45d1-8b4b-4469b97ceba8" containerName="registry-server" probeResult="failure" output=< Mar 17 12:10:38 crc kubenswrapper[4742]: timeout: failed to connect service ":50051" within 1s Mar 17 12:10:38 crc kubenswrapper[4742]: > Mar 17 12:10:48 crc kubenswrapper[4742]: I0317 12:10:48.709146 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c2hjn" podUID="5b0ae19f-e24c-45d1-8b4b-4469b97ceba8" containerName="registry-server" probeResult="failure" output=< Mar 17 12:10:48 crc kubenswrapper[4742]: timeout: failed to connect service ":50051" within 1s Mar 17 12:10:48 crc kubenswrapper[4742]: > Mar 17 12:10:50 crc kubenswrapper[4742]: I0317 12:10:50.391552 4742 generic.go:334] "Generic (PLEG): container finished" podID="cbe323de-3d55-4905-8f28-29cea959ae35" containerID="4b4bd5b2fc127bd20641300159a13b0689286cc862afe3593d73593d72e88aa3" exitCode=0 Mar 17 12:10:50 crc kubenswrapper[4742]: I0317 12:10:50.392017 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"cbe323de-3d55-4905-8f28-29cea959ae35","Type":"ContainerDied","Data":"4b4bd5b2fc127bd20641300159a13b0689286cc862afe3593d73593d72e88aa3"} Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.755356 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.894834 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cbe323de-3d55-4905-8f28-29cea959ae35-openstack-config\") pod \"cbe323de-3d55-4905-8f28-29cea959ae35\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.895236 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-ca-certs\") pod \"cbe323de-3d55-4905-8f28-29cea959ae35\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.895277 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/cbe323de-3d55-4905-8f28-29cea959ae35-test-operator-ephemeral-temporary\") pod \"cbe323de-3d55-4905-8f28-29cea959ae35\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.895305 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/cbe323de-3d55-4905-8f28-29cea959ae35-test-operator-ephemeral-workdir\") pod \"cbe323de-3d55-4905-8f28-29cea959ae35\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.895333 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbe323de-3d55-4905-8f28-29cea959ae35-config-data\") pod \"cbe323de-3d55-4905-8f28-29cea959ae35\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.895402 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9npq\" (UniqueName: \"kubernetes.io/projected/cbe323de-3d55-4905-8f28-29cea959ae35-kube-api-access-l9npq\") pod \"cbe323de-3d55-4905-8f28-29cea959ae35\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.895546 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-ssh-key\") pod \"cbe323de-3d55-4905-8f28-29cea959ae35\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.895569 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cbe323de-3d55-4905-8f28-29cea959ae35\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.895622 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-openstack-config-secret\") pod \"cbe323de-3d55-4905-8f28-29cea959ae35\" (UID: \"cbe323de-3d55-4905-8f28-29cea959ae35\") " Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.897387 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbe323de-3d55-4905-8f28-29cea959ae35-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "cbe323de-3d55-4905-8f28-29cea959ae35" (UID: "cbe323de-3d55-4905-8f28-29cea959ae35"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.898223 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbe323de-3d55-4905-8f28-29cea959ae35-config-data" (OuterVolumeSpecName: "config-data") pod "cbe323de-3d55-4905-8f28-29cea959ae35" (UID: "cbe323de-3d55-4905-8f28-29cea959ae35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.900466 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "cbe323de-3d55-4905-8f28-29cea959ae35" (UID: "cbe323de-3d55-4905-8f28-29cea959ae35"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.901096 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbe323de-3d55-4905-8f28-29cea959ae35-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "cbe323de-3d55-4905-8f28-29cea959ae35" (UID: "cbe323de-3d55-4905-8f28-29cea959ae35"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.902062 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe323de-3d55-4905-8f28-29cea959ae35-kube-api-access-l9npq" (OuterVolumeSpecName: "kube-api-access-l9npq") pod "cbe323de-3d55-4905-8f28-29cea959ae35" (UID: "cbe323de-3d55-4905-8f28-29cea959ae35"). InnerVolumeSpecName "kube-api-access-l9npq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.926675 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "cbe323de-3d55-4905-8f28-29cea959ae35" (UID: "cbe323de-3d55-4905-8f28-29cea959ae35"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.929169 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "cbe323de-3d55-4905-8f28-29cea959ae35" (UID: "cbe323de-3d55-4905-8f28-29cea959ae35"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.929675 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cbe323de-3d55-4905-8f28-29cea959ae35" (UID: "cbe323de-3d55-4905-8f28-29cea959ae35"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.948889 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbe323de-3d55-4905-8f28-29cea959ae35-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "cbe323de-3d55-4905-8f28-29cea959ae35" (UID: "cbe323de-3d55-4905-8f28-29cea959ae35"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.998106 4742 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/cbe323de-3d55-4905-8f28-29cea959ae35-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.998135 4742 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.998146 4742 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/cbe323de-3d55-4905-8f28-29cea959ae35-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.998159 4742 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/cbe323de-3d55-4905-8f28-29cea959ae35-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.998168 4742 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbe323de-3d55-4905-8f28-29cea959ae35-config-data\") on node \"crc\" DevicePath \"\"" Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.998176 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9npq\" (UniqueName: \"kubernetes.io/projected/cbe323de-3d55-4905-8f28-29cea959ae35-kube-api-access-l9npq\") on node \"crc\" DevicePath \"\"" Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.998187 4742 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.998220 4742 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 17 12:10:51 crc kubenswrapper[4742]: I0317 12:10:51.998229 4742 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/cbe323de-3d55-4905-8f28-29cea959ae35-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 17 12:10:52 crc kubenswrapper[4742]: I0317 12:10:52.019341 4742 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 17 12:10:52 crc kubenswrapper[4742]: I0317 12:10:52.099389 4742 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 17 12:10:52 crc kubenswrapper[4742]: I0317 12:10:52.410150 4742 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tempest-tests-tempest" event={"ID":"cbe323de-3d55-4905-8f28-29cea959ae35","Type":"ContainerDied","Data":"be5d1e3fa7693f154bb5cce1fe0b1971babe4f088c8cd497cd7163100b9d0a60"} Mar 17 12:10:52 crc kubenswrapper[4742]: I0317 12:10:52.410191 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be5d1e3fa7693f154bb5cce1fe0b1971babe4f088c8cd497cd7163100b9d0a60" Mar 17 12:10:52 crc kubenswrapper[4742]: I0317 12:10:52.410207 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 17 12:10:57 crc kubenswrapper[4742]: I0317 12:10:57.191758 4742 scope.go:117] "RemoveContainer" containerID="ea61e793723da8f929adfd34c5a03b15fadd8b901ade8d0ae8e71b3824e9973d" Mar 17 12:10:57 crc kubenswrapper[4742]: I0317 12:10:57.257756 4742 scope.go:117] "RemoveContainer" containerID="086446d56dcf58182e48e9e2d1cd8177824862bae41e66202a706759ac22b546" Mar 17 12:10:57 crc kubenswrapper[4742]: I0317 12:10:57.295990 4742 scope.go:117] "RemoveContainer" containerID="e79a3f52c5022e2d2072115c98c9cc4b8f65de1ed18d6b766e4ff3de81058cd4" Mar 17 12:10:57 crc kubenswrapper[4742]: I0317 12:10:57.322842 4742 scope.go:117] "RemoveContainer" containerID="8ea0131e34dbbe47e0c267836570905ef59c3fd0ff460007891d9ae840542e6e" Mar 17 12:10:57 crc kubenswrapper[4742]: I0317 12:10:57.715121 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c2hjn" Mar 17 12:10:57 crc kubenswrapper[4742]: I0317 12:10:57.770047 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c2hjn" Mar 17 12:10:58 crc kubenswrapper[4742]: I0317 12:10:58.447952 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c2hjn"] Mar 17 12:10:59 crc kubenswrapper[4742]: I0317 12:10:59.476033 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c2hjn" podUID="5b0ae19f-e24c-45d1-8b4b-4469b97ceba8" containerName="registry-server" containerID="cri-o://67a859d56da81f0a50436d68214a89abb7913bac6429ae52e8c3a51d2bebbac7" gracePeriod=2 Mar 17 12:10:59 crc kubenswrapper[4742]: I0317 12:10:59.968617 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c2hjn" Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.095670 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt7rk\" (UniqueName: \"kubernetes.io/projected/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-kube-api-access-kt7rk\") pod \"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8\" (UID: \"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8\") " Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.095750 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-utilities\") pod \"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8\" (UID: \"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8\") " Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.095938 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-catalog-content\") pod \"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8\" (UID: \"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8\") " Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.097241 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-utilities" (OuterVolumeSpecName: "utilities") pod "5b0ae19f-e24c-45d1-8b4b-4469b97ceba8" (UID: "5b0ae19f-e24c-45d1-8b4b-4469b97ceba8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.107164 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-kube-api-access-kt7rk" (OuterVolumeSpecName: "kube-api-access-kt7rk") pod "5b0ae19f-e24c-45d1-8b4b-4469b97ceba8" (UID: "5b0ae19f-e24c-45d1-8b4b-4469b97ceba8"). InnerVolumeSpecName "kube-api-access-kt7rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.205057 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt7rk\" (UniqueName: \"kubernetes.io/projected/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-kube-api-access-kt7rk\") on node \"crc\" DevicePath \"\"" Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.205110 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.282846 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b0ae19f-e24c-45d1-8b4b-4469b97ceba8" (UID: "5b0ae19f-e24c-45d1-8b4b-4469b97ceba8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.306243 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.492100 4742 generic.go:334] "Generic (PLEG): container finished" podID="5b0ae19f-e24c-45d1-8b4b-4469b97ceba8" containerID="67a859d56da81f0a50436d68214a89abb7913bac6429ae52e8c3a51d2bebbac7" exitCode=0 Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.492163 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c2hjn" Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.492170 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2hjn" event={"ID":"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8","Type":"ContainerDied","Data":"67a859d56da81f0a50436d68214a89abb7913bac6429ae52e8c3a51d2bebbac7"} Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.492312 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2hjn" event={"ID":"5b0ae19f-e24c-45d1-8b4b-4469b97ceba8","Type":"ContainerDied","Data":"a504bd3cba19a73c83084359e6a1b442d56ce1135f4000d4cf00d9c573bcb21e"} Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.492347 4742 scope.go:117] "RemoveContainer" containerID="67a859d56da81f0a50436d68214a89abb7913bac6429ae52e8c3a51d2bebbac7" Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.530400 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c2hjn"] Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.538221 4742 scope.go:117] "RemoveContainer" containerID="848b265a80a2c8d9724b63f7b970a96f5ce1d741c3d733df6cef0f93093b2b0f" Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.539421 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c2hjn"] Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.571584 4742 scope.go:117] "RemoveContainer" containerID="4737db3194af7312585fa794fafbc02e1eb960e5f902c79ede14cb3f45cdf799" Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.623869 4742 scope.go:117] "RemoveContainer" containerID="67a859d56da81f0a50436d68214a89abb7913bac6429ae52e8c3a51d2bebbac7" Mar 17 12:11:00 crc kubenswrapper[4742]: E0317 12:11:00.624476 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67a859d56da81f0a50436d68214a89abb7913bac6429ae52e8c3a51d2bebbac7\": container with ID starting with 67a859d56da81f0a50436d68214a89abb7913bac6429ae52e8c3a51d2bebbac7 not found: ID does not exist" containerID="67a859d56da81f0a50436d68214a89abb7913bac6429ae52e8c3a51d2bebbac7" Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.624552 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a859d56da81f0a50436d68214a89abb7913bac6429ae52e8c3a51d2bebbac7"} err="failed to get container status \"67a859d56da81f0a50436d68214a89abb7913bac6429ae52e8c3a51d2bebbac7\": rpc error: code = NotFound desc = could not find container \"67a859d56da81f0a50436d68214a89abb7913bac6429ae52e8c3a51d2bebbac7\": container with ID starting with 67a859d56da81f0a50436d68214a89abb7913bac6429ae52e8c3a51d2bebbac7 not found: ID does not exist" Mar 17 12:11:00 crc 
kubenswrapper[4742]: I0317 12:11:00.624582 4742 scope.go:117] "RemoveContainer" containerID="848b265a80a2c8d9724b63f7b970a96f5ce1d741c3d733df6cef0f93093b2b0f" Mar 17 12:11:00 crc kubenswrapper[4742]: E0317 12:11:00.625036 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"848b265a80a2c8d9724b63f7b970a96f5ce1d741c3d733df6cef0f93093b2b0f\": container with ID starting with 848b265a80a2c8d9724b63f7b970a96f5ce1d741c3d733df6cef0f93093b2b0f not found: ID does not exist" containerID="848b265a80a2c8d9724b63f7b970a96f5ce1d741c3d733df6cef0f93093b2b0f" Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.625072 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"848b265a80a2c8d9724b63f7b970a96f5ce1d741c3d733df6cef0f93093b2b0f"} err="failed to get container status \"848b265a80a2c8d9724b63f7b970a96f5ce1d741c3d733df6cef0f93093b2b0f\": rpc error: code = NotFound desc = could not find container \"848b265a80a2c8d9724b63f7b970a96f5ce1d741c3d733df6cef0f93093b2b0f\": container with ID starting with 848b265a80a2c8d9724b63f7b970a96f5ce1d741c3d733df6cef0f93093b2b0f not found: ID does not exist" Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.625094 4742 scope.go:117] "RemoveContainer" containerID="4737db3194af7312585fa794fafbc02e1eb960e5f902c79ede14cb3f45cdf799" Mar 17 12:11:00 crc kubenswrapper[4742]: E0317 12:11:00.626029 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4737db3194af7312585fa794fafbc02e1eb960e5f902c79ede14cb3f45cdf799\": container with ID starting with 4737db3194af7312585fa794fafbc02e1eb960e5f902c79ede14cb3f45cdf799 not found: ID does not exist" containerID="4737db3194af7312585fa794fafbc02e1eb960e5f902c79ede14cb3f45cdf799" Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.626259 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4737db3194af7312585fa794fafbc02e1eb960e5f902c79ede14cb3f45cdf799"} err="failed to get container status \"4737db3194af7312585fa794fafbc02e1eb960e5f902c79ede14cb3f45cdf799\": rpc error: code = NotFound desc = could not find container \"4737db3194af7312585fa794fafbc02e1eb960e5f902c79ede14cb3f45cdf799\": container with ID starting with 4737db3194af7312585fa794fafbc02e1eb960e5f902c79ede14cb3f45cdf799 not found: ID does not exist" Mar 17 12:11:00 crc kubenswrapper[4742]: I0317 12:11:00.684735 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b0ae19f-e24c-45d1-8b4b-4469b97ceba8" path="/var/lib/kubelet/pods/5b0ae19f-e24c-45d1-8b4b-4469b97ceba8/volumes" Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.191575 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 17 12:11:02 crc kubenswrapper[4742]: E0317 12:11:02.192191 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe323de-3d55-4905-8f28-29cea959ae35" containerName="tempest-tests-tempest-tests-runner" Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.192212 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe323de-3d55-4905-8f28-29cea959ae35" containerName="tempest-tests-tempest-tests-runner" Mar 17 12:11:02 crc kubenswrapper[4742]: E0317 12:11:02.192253 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0ae19f-e24c-45d1-8b4b-4469b97ceba8" containerName="extract-utilities" Mar 17 12:11:02 crc 
kubenswrapper[4742]: I0317 12:11:02.192264 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0ae19f-e24c-45d1-8b4b-4469b97ceba8" containerName="extract-utilities" Mar 17 12:11:02 crc kubenswrapper[4742]: E0317 12:11:02.192283 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0ae19f-e24c-45d1-8b4b-4469b97ceba8" containerName="extract-content" Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.192293 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0ae19f-e24c-45d1-8b4b-4469b97ceba8" containerName="extract-content" Mar 17 12:11:02 crc kubenswrapper[4742]: E0317 12:11:02.192325 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0ae19f-e24c-45d1-8b4b-4469b97ceba8" containerName="registry-server" Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.192336 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0ae19f-e24c-45d1-8b4b-4469b97ceba8" containerName="registry-server" Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.192633 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe323de-3d55-4905-8f28-29cea959ae35" containerName="tempest-tests-tempest-tests-runner" Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.192665 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b0ae19f-e24c-45d1-8b4b-4469b97ceba8" containerName="registry-server" Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.193709 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.195700 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-sm8hx" Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.212502 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.345253 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6bfbc7cf-c913-4297-a60e-307a3829b636\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.345333 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbj94\" (UniqueName: \"kubernetes.io/projected/6bfbc7cf-c913-4297-a60e-307a3829b636-kube-api-access-xbj94\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6bfbc7cf-c913-4297-a60e-307a3829b636\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.447127 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6bfbc7cf-c913-4297-a60e-307a3829b636\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.447198 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbj94\" (UniqueName: \"kubernetes.io/projected/6bfbc7cf-c913-4297-a60e-307a3829b636-kube-api-access-xbj94\") pod 
\"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6bfbc7cf-c913-4297-a60e-307a3829b636\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.447871 4742 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6bfbc7cf-c913-4297-a60e-307a3829b636\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.471730 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbj94\" (UniqueName: \"kubernetes.io/projected/6bfbc7cf-c913-4297-a60e-307a3829b636-kube-api-access-xbj94\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6bfbc7cf-c913-4297-a60e-307a3829b636\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.473569 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6bfbc7cf-c913-4297-a60e-307a3829b636\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.528036 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 17 12:11:02 crc kubenswrapper[4742]: I0317 12:11:02.986381 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 17 12:11:03 crc kubenswrapper[4742]: I0317 12:11:03.525583 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6bfbc7cf-c913-4297-a60e-307a3829b636","Type":"ContainerStarted","Data":"7b60bbca6b6848619b59ee27a1f9269f04657d0048b78ebbdc35a5ba504eef9f"} Mar 17 12:11:04 crc kubenswrapper[4742]: I0317 12:11:04.537899 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6bfbc7cf-c913-4297-a60e-307a3829b636","Type":"ContainerStarted","Data":"a91461eb46dc5e21f7e70141041ee9f5e3848bc0ab248a01cb464c900939b6f9"} Mar 17 12:11:04 crc kubenswrapper[4742]: I0317 12:11:04.560561 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.6617056780000001 podStartE2EDuration="2.560536183s" podCreationTimestamp="2026-03-17 12:11:02 +0000 UTC" firstStartedPulling="2026-03-17 12:11:02.997621824 +0000 UTC m=+3566.123749582" lastFinishedPulling="2026-03-17 12:11:03.896452319 +0000 UTC m=+3567.022580087" observedRunningTime="2026-03-17 12:11:04.557354676 +0000 UTC m=+3567.683482524" watchObservedRunningTime="2026-03-17 12:11:04.560536183 +0000 UTC m=+3567.686663981" Mar 17 12:11:18 crc kubenswrapper[4742]: I0317 12:11:18.044951 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 17 12:11:18 crc kubenswrapper[4742]: I0317 12:11:18.047451 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 12:11:30 crc kubenswrapper[4742]: I0317 12:11:30.399142 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mltbx/must-gather-4dhpl"] Mar 17 12:11:30 crc kubenswrapper[4742]: I0317 12:11:30.402521 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mltbx/must-gather-4dhpl" Mar 17 12:11:30 crc kubenswrapper[4742]: I0317 12:11:30.405767 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mltbx"/"kube-root-ca.crt" Mar 17 12:11:30 crc kubenswrapper[4742]: I0317 12:11:30.408174 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mltbx"/"openshift-service-ca.crt" Mar 17 12:11:30 crc kubenswrapper[4742]: I0317 12:11:30.445190 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mltbx/must-gather-4dhpl"] Mar 17 12:11:30 crc kubenswrapper[4742]: I0317 12:11:30.463297 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0466a590-af75-4814-9161-b142a0a62674-must-gather-output\") pod \"must-gather-4dhpl\" (UID: \"0466a590-af75-4814-9161-b142a0a62674\") " pod="openshift-must-gather-mltbx/must-gather-4dhpl" Mar 17 12:11:30 crc kubenswrapper[4742]: I0317 12:11:30.463731 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn6pv\" (UniqueName: \"kubernetes.io/projected/0466a590-af75-4814-9161-b142a0a62674-kube-api-access-bn6pv\") pod \"must-gather-4dhpl\" (UID: \"0466a590-af75-4814-9161-b142a0a62674\") " pod="openshift-must-gather-mltbx/must-gather-4dhpl" Mar 17 12:11:30 crc kubenswrapper[4742]: I0317 12:11:30.565187 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn6pv\" (UniqueName: \"kubernetes.io/projected/0466a590-af75-4814-9161-b142a0a62674-kube-api-access-bn6pv\") pod \"must-gather-4dhpl\" (UID: \"0466a590-af75-4814-9161-b142a0a62674\") " pod="openshift-must-gather-mltbx/must-gather-4dhpl" Mar 17 12:11:30 crc kubenswrapper[4742]: I0317 12:11:30.565471 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0466a590-af75-4814-9161-b142a0a62674-must-gather-output\") pod \"must-gather-4dhpl\" (UID: \"0466a590-af75-4814-9161-b142a0a62674\") " pod="openshift-must-gather-mltbx/must-gather-4dhpl" Mar 17 12:11:30 crc kubenswrapper[4742]: I0317 12:11:30.566055 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0466a590-af75-4814-9161-b142a0a62674-must-gather-output\") pod \"must-gather-4dhpl\" (UID: \"0466a590-af75-4814-9161-b142a0a62674\") " pod="openshift-must-gather-mltbx/must-gather-4dhpl" Mar 17 12:11:30 crc kubenswrapper[4742]: I0317 12:11:30.598225 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn6pv\" (UniqueName: 
\"kubernetes.io/projected/0466a590-af75-4814-9161-b142a0a62674-kube-api-access-bn6pv\") pod \"must-gather-4dhpl\" (UID: \"0466a590-af75-4814-9161-b142a0a62674\") " pod="openshift-must-gather-mltbx/must-gather-4dhpl" Mar 17 12:11:30 crc kubenswrapper[4742]: I0317 12:11:30.729384 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mltbx/must-gather-4dhpl" Mar 17 12:11:31 crc kubenswrapper[4742]: I0317 12:11:31.289082 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mltbx/must-gather-4dhpl"] Mar 17 12:11:31 crc kubenswrapper[4742]: I0317 12:11:31.846303 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mltbx/must-gather-4dhpl" event={"ID":"0466a590-af75-4814-9161-b142a0a62674","Type":"ContainerStarted","Data":"d13bb5dc979eee2f7f618e20e5a7c1ac69b746be8a989c5596f252fa2bd4086d"} Mar 17 12:11:37 crc kubenswrapper[4742]: I0317 12:11:37.915612 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mltbx/must-gather-4dhpl" event={"ID":"0466a590-af75-4814-9161-b142a0a62674","Type":"ContainerStarted","Data":"792c659793073ca4d0a8cfa615c1d573b2c8910eda00d37820f430d07c4bf6a8"} Mar 17 12:11:37 crc kubenswrapper[4742]: I0317 12:11:37.916286 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mltbx/must-gather-4dhpl" event={"ID":"0466a590-af75-4814-9161-b142a0a62674","Type":"ContainerStarted","Data":"c7472c9be7848bfef7e6a7577f50ce3a8d41e6cf0dc4bb17660210bf13db88e7"} Mar 17 12:11:37 crc kubenswrapper[4742]: I0317 12:11:37.940801 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mltbx/must-gather-4dhpl" podStartSLOduration=1.88046731 podStartE2EDuration="7.940780248s" podCreationTimestamp="2026-03-17 12:11:30 +0000 UTC" firstStartedPulling="2026-03-17 12:11:31.304158028 +0000 UTC m=+3594.430285826" lastFinishedPulling="2026-03-17 12:11:37.364470996 +0000 UTC m=+3600.490598764" observedRunningTime="2026-03-17 12:11:37.935005291 +0000 UTC m=+3601.061133069" watchObservedRunningTime="2026-03-17 12:11:37.940780248 +0000 UTC m=+3601.066908026" Mar 17 12:11:41 crc kubenswrapper[4742]: I0317 12:11:41.206297 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mltbx/crc-debug-jsjq5"] Mar 17 12:11:41 crc kubenswrapper[4742]: I0317 12:11:41.213069 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mltbx/crc-debug-jsjq5" Mar 17 12:11:41 crc kubenswrapper[4742]: I0317 12:11:41.215043 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mltbx"/"default-dockercfg-rd4mc" Mar 17 12:11:41 crc kubenswrapper[4742]: I0317 12:11:41.396128 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brfqj\" (UniqueName: \"kubernetes.io/projected/7e3db6d8-ce89-4d0b-8491-91cd81d476a3-kube-api-access-brfqj\") pod \"crc-debug-jsjq5\" (UID: \"7e3db6d8-ce89-4d0b-8491-91cd81d476a3\") " pod="openshift-must-gather-mltbx/crc-debug-jsjq5" Mar 17 12:11:41 crc kubenswrapper[4742]: I0317 12:11:41.396429 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e3db6d8-ce89-4d0b-8491-91cd81d476a3-host\") pod \"crc-debug-jsjq5\" (UID: \"7e3db6d8-ce89-4d0b-8491-91cd81d476a3\") " pod="openshift-must-gather-mltbx/crc-debug-jsjq5" Mar 17 12:11:41 crc kubenswrapper[4742]: I0317 12:11:41.498807 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brfqj\" (UniqueName: \"kubernetes.io/projected/7e3db6d8-ce89-4d0b-8491-91cd81d476a3-kube-api-access-brfqj\") pod \"crc-debug-jsjq5\" (UID: \"7e3db6d8-ce89-4d0b-8491-91cd81d476a3\") " pod="openshift-must-gather-mltbx/crc-debug-jsjq5" Mar 17 12:11:41 crc kubenswrapper[4742]: I0317 12:11:41.498953 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e3db6d8-ce89-4d0b-8491-91cd81d476a3-host\") pod \"crc-debug-jsjq5\" (UID: \"7e3db6d8-ce89-4d0b-8491-91cd81d476a3\") " pod="openshift-must-gather-mltbx/crc-debug-jsjq5" Mar 17 12:11:41 crc kubenswrapper[4742]: I0317 12:11:41.499185 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e3db6d8-ce89-4d0b-8491-91cd81d476a3-host\") pod \"crc-debug-jsjq5\" (UID: \"7e3db6d8-ce89-4d0b-8491-91cd81d476a3\") " pod="openshift-must-gather-mltbx/crc-debug-jsjq5" Mar 17 12:11:41 crc kubenswrapper[4742]: I0317 12:11:41.517010 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brfqj\" (UniqueName: \"kubernetes.io/projected/7e3db6d8-ce89-4d0b-8491-91cd81d476a3-kube-api-access-brfqj\") pod \"crc-debug-jsjq5\" (UID: \"7e3db6d8-ce89-4d0b-8491-91cd81d476a3\") " pod="openshift-must-gather-mltbx/crc-debug-jsjq5" Mar 17 12:11:41 crc kubenswrapper[4742]: I0317 12:11:41.531670 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mltbx/crc-debug-jsjq5" Mar 17 12:11:41 crc kubenswrapper[4742]: W0317 12:11:41.573626 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e3db6d8_ce89_4d0b_8491_91cd81d476a3.slice/crio-d80189e6ea459109a6096852da5bc6880055146c94d08be55c6272a59428b8bc WatchSource:0}: Error finding container d80189e6ea459109a6096852da5bc6880055146c94d08be55c6272a59428b8bc: Status 404 returned error can't find the container with id d80189e6ea459109a6096852da5bc6880055146c94d08be55c6272a59428b8bc Mar 17 12:11:41 crc kubenswrapper[4742]: I0317 12:11:41.952877 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mltbx/crc-debug-jsjq5" event={"ID":"7e3db6d8-ce89-4d0b-8491-91cd81d476a3","Type":"ContainerStarted","Data":"d80189e6ea459109a6096852da5bc6880055146c94d08be55c6272a59428b8bc"} Mar 17 12:11:48 crc kubenswrapper[4742]: I0317 12:11:48.044611 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 12:11:48 crc kubenswrapper[4742]: I0317 12:11:48.045211 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 12:11:53 crc kubenswrapper[4742]: I0317 12:11:53.067487 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mltbx/crc-debug-jsjq5" event={"ID":"7e3db6d8-ce89-4d0b-8491-91cd81d476a3","Type":"ContainerStarted","Data":"72e91e07454c6d2075a2193a1d0b3014df3c7c4b731f73fd56151126ba1f7be7"} Mar 17 12:11:53 crc kubenswrapper[4742]: I0317 12:11:53.085416 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mltbx/crc-debug-jsjq5" podStartSLOduration=0.80998149 podStartE2EDuration="12.085397137s" podCreationTimestamp="2026-03-17 12:11:41 +0000 UTC" firstStartedPulling="2026-03-17 12:11:41.576165374 +0000 UTC m=+3604.702293152" lastFinishedPulling="2026-03-17 12:11:52.851581031 +0000 UTC m=+3615.977708799" observedRunningTime="2026-03-17 12:11:53.080948326 +0000 UTC m=+3616.207076084" watchObservedRunningTime="2026-03-17 12:11:53.085397137 +0000 UTC m=+3616.211524895" Mar 17 12:12:00 crc kubenswrapper[4742]: I0317 12:12:00.161893 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562492-cvvvl"] Mar 17 12:12:00 crc kubenswrapper[4742]: I0317 12:12:00.165337 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562492-cvvvl" Mar 17 12:12:00 crc kubenswrapper[4742]: I0317 12:12:00.168424 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 12:12:00 crc kubenswrapper[4742]: I0317 12:12:00.176187 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562492-cvvvl"] Mar 17 12:12:00 crc kubenswrapper[4742]: I0317 12:12:00.183117 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 12:12:00 crc kubenswrapper[4742]: I0317 12:12:00.183245 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 12:12:00 crc kubenswrapper[4742]: I0317 12:12:00.276394 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8dqf\" (UniqueName: \"kubernetes.io/projected/21c511e2-9f3d-45fd-a671-37a745506f9b-kube-api-access-k8dqf\") pod \"auto-csr-approver-29562492-cvvvl\" (UID: \"21c511e2-9f3d-45fd-a671-37a745506f9b\") " pod="openshift-infra/auto-csr-approver-29562492-cvvvl" Mar 17 12:12:00 crc kubenswrapper[4742]: I0317 12:12:00.378000 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8dqf\" (UniqueName: \"kubernetes.io/projected/21c511e2-9f3d-45fd-a671-37a745506f9b-kube-api-access-k8dqf\") pod \"auto-csr-approver-29562492-cvvvl\" (UID: \"21c511e2-9f3d-45fd-a671-37a745506f9b\") " pod="openshift-infra/auto-csr-approver-29562492-cvvvl" Mar 17 12:12:00 crc kubenswrapper[4742]: I0317 12:12:00.401882 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8dqf\" (UniqueName: \"kubernetes.io/projected/21c511e2-9f3d-45fd-a671-37a745506f9b-kube-api-access-k8dqf\") pod \"auto-csr-approver-29562492-cvvvl\" (UID: \"21c511e2-9f3d-45fd-a671-37a745506f9b\") " pod="openshift-infra/auto-csr-approver-29562492-cvvvl" Mar 17 12:12:00 crc kubenswrapper[4742]: I0317 12:12:00.482805 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562492-cvvvl" Mar 17 12:12:00 crc kubenswrapper[4742]: I0317 12:12:00.956510 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562492-cvvvl"] Mar 17 12:12:01 crc kubenswrapper[4742]: I0317 12:12:01.133337 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562492-cvvvl" event={"ID":"21c511e2-9f3d-45fd-a671-37a745506f9b","Type":"ContainerStarted","Data":"738cd735815183692c1316210475e311c1c31406906008a912881fe9fde3a053"} Mar 17 12:12:03 crc kubenswrapper[4742]: I0317 12:12:03.181267 4742 generic.go:334] "Generic (PLEG): container finished" podID="21c511e2-9f3d-45fd-a671-37a745506f9b" containerID="b79317541ea8ef8e901e52b58f90577e9725970beec390528d1058e09e7700d8" exitCode=0 Mar 17 12:12:03 crc kubenswrapper[4742]: I0317 12:12:03.181827 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562492-cvvvl" event={"ID":"21c511e2-9f3d-45fd-a671-37a745506f9b","Type":"ContainerDied","Data":"b79317541ea8ef8e901e52b58f90577e9725970beec390528d1058e09e7700d8"} Mar 17 12:12:05 crc kubenswrapper[4742]: I0317 12:12:05.201877 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562492-cvvvl" event={"ID":"21c511e2-9f3d-45fd-a671-37a745506f9b","Type":"ContainerDied","Data":"738cd735815183692c1316210475e311c1c31406906008a912881fe9fde3a053"} Mar 17 12:12:05 crc kubenswrapper[4742]: I0317 12:12:05.203517 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="738cd735815183692c1316210475e311c1c31406906008a912881fe9fde3a053" Mar 17 12:12:06 crc kubenswrapper[4742]: I0317 12:12:06.365089 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6wt87"] Mar 17 12:12:06 crc kubenswrapper[4742]: I0317 12:12:06.368711 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6wt87" Mar 17 12:12:06 crc kubenswrapper[4742]: I0317 12:12:06.382271 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6wt87"] Mar 17 12:12:06 crc kubenswrapper[4742]: I0317 12:12:06.414874 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562492-cvvvl" Mar 17 12:12:06 crc kubenswrapper[4742]: I0317 12:12:06.484039 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8dqf\" (UniqueName: \"kubernetes.io/projected/21c511e2-9f3d-45fd-a671-37a745506f9b-kube-api-access-k8dqf\") pod \"21c511e2-9f3d-45fd-a671-37a745506f9b\" (UID: \"21c511e2-9f3d-45fd-a671-37a745506f9b\") " Mar 17 12:12:06 crc kubenswrapper[4742]: I0317 12:12:06.484513 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-utilities\") pod \"certified-operators-6wt87\" (UID: \"f95c5fcf-8d03-4a13-a8df-b86575a3b13e\") " pod="openshift-marketplace/certified-operators-6wt87" Mar 17 12:12:06 crc kubenswrapper[4742]: I0317 12:12:06.484550 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-catalog-content\") pod \"certified-operators-6wt87\" (UID: \"f95c5fcf-8d03-4a13-a8df-b86575a3b13e\") " pod="openshift-marketplace/certified-operators-6wt87" Mar 17 12:12:06 crc kubenswrapper[4742]: I0317 12:12:06.484659 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9hrd\" (UniqueName: \"kubernetes.io/projected/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-kube-api-access-p9hrd\") pod \"certified-operators-6wt87\" (UID: \"f95c5fcf-8d03-4a13-a8df-b86575a3b13e\") " pod="openshift-marketplace/certified-operators-6wt87" Mar 17 12:12:06 crc kubenswrapper[4742]: I0317 12:12:06.504256 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21c511e2-9f3d-45fd-a671-37a745506f9b-kube-api-access-k8dqf" (OuterVolumeSpecName: "kube-api-access-k8dqf") pod "21c511e2-9f3d-45fd-a671-37a745506f9b" (UID: "21c511e2-9f3d-45fd-a671-37a745506f9b"). InnerVolumeSpecName "kube-api-access-k8dqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:12:06 crc kubenswrapper[4742]: I0317 12:12:06.587641 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9hrd\" (UniqueName: \"kubernetes.io/projected/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-kube-api-access-p9hrd\") pod \"certified-operators-6wt87\" (UID: \"f95c5fcf-8d03-4a13-a8df-b86575a3b13e\") " pod="openshift-marketplace/certified-operators-6wt87" Mar 17 12:12:06 crc kubenswrapper[4742]: I0317 12:12:06.588215 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-utilities\") pod \"certified-operators-6wt87\" (UID: \"f95c5fcf-8d03-4a13-a8df-b86575a3b13e\") " pod="openshift-marketplace/certified-operators-6wt87" Mar 17 12:12:06 crc kubenswrapper[4742]: I0317 12:12:06.588241 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-catalog-content\") pod \"certified-operators-6wt87\" (UID: \"f95c5fcf-8d03-4a13-a8df-b86575a3b13e\") " pod="openshift-marketplace/certified-operators-6wt87" Mar 17 12:12:06 crc kubenswrapper[4742]: I0317 12:12:06.588356 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8dqf\" (UniqueName: \"kubernetes.io/projected/21c511e2-9f3d-45fd-a671-37a745506f9b-kube-api-access-k8dqf\") on node \"crc\" DevicePath \"\"" Mar 17 12:12:06 crc kubenswrapper[4742]: I0317 12:12:06.588814 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-catalog-content\") pod \"certified-operators-6wt87\" (UID: \"f95c5fcf-8d03-4a13-a8df-b86575a3b13e\") " pod="openshift-marketplace/certified-operators-6wt87" Mar 17 12:12:06 crc kubenswrapper[4742]: I0317 12:12:06.589272 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-utilities\") pod \"certified-operators-6wt87\" (UID: \"f95c5fcf-8d03-4a13-a8df-b86575a3b13e\") " pod="openshift-marketplace/certified-operators-6wt87" Mar 17 12:12:06 crc kubenswrapper[4742]: I0317 12:12:06.615362 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9hrd\" (UniqueName: \"kubernetes.io/projected/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-kube-api-access-p9hrd\") pod \"certified-operators-6wt87\" (UID: \"f95c5fcf-8d03-4a13-a8df-b86575a3b13e\") " pod="openshift-marketplace/certified-operators-6wt87" Mar 17 12:12:06 crc kubenswrapper[4742]: I0317 12:12:06.724594 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6wt87" Mar 17 12:12:07 crc kubenswrapper[4742]: I0317 12:12:07.218960 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562492-cvvvl" Mar 17 12:12:07 crc kubenswrapper[4742]: I0317 12:12:07.295088 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6wt87"] Mar 17 12:12:07 crc kubenswrapper[4742]: W0317 12:12:07.315397 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf95c5fcf_8d03_4a13_a8df_b86575a3b13e.slice/crio-913a793c3449c47ca33bba1199d21cbeed9d675f0ae090846a4152f531a62921 WatchSource:0}: Error finding container 913a793c3449c47ca33bba1199d21cbeed9d675f0ae090846a4152f531a62921: Status 404 returned error can't find the container with id 913a793c3449c47ca33bba1199d21cbeed9d675f0ae090846a4152f531a62921 Mar 17 12:12:07 crc kubenswrapper[4742]: I0317 12:12:07.483379 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562486-hrrqx"] Mar 17 12:12:07 crc kubenswrapper[4742]: I0317 12:12:07.491981 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562486-hrrqx"] Mar 17 12:12:08 crc kubenswrapper[4742]: I0317 12:12:08.229852 4742 generic.go:334] "Generic (PLEG): container finished" podID="f95c5fcf-8d03-4a13-a8df-b86575a3b13e" containerID="fcb07cbaeb17ae1bded1123479ecb3926a6d0a3101aa37a5d3db286759e8be94" exitCode=0 Mar 17 12:12:08 crc kubenswrapper[4742]: I0317 12:12:08.230122 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wt87" event={"ID":"f95c5fcf-8d03-4a13-a8df-b86575a3b13e","Type":"ContainerDied","Data":"fcb07cbaeb17ae1bded1123479ecb3926a6d0a3101aa37a5d3db286759e8be94"} Mar 17 12:12:08 crc kubenswrapper[4742]: I0317 12:12:08.230314 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wt87" event={"ID":"f95c5fcf-8d03-4a13-a8df-b86575a3b13e","Type":"ContainerStarted","Data":"913a793c3449c47ca33bba1199d21cbeed9d675f0ae090846a4152f531a62921"} Mar 17 12:12:08 crc kubenswrapper[4742]: I0317 12:12:08.672691 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ab30c1-6a19-43e3-b6c1-fd5003e27c33" path="/var/lib/kubelet/pods/a5ab30c1-6a19-43e3-b6c1-fd5003e27c33/volumes" Mar 17 12:12:09 crc kubenswrapper[4742]: I0317 12:12:09.243774 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wt87" event={"ID":"f95c5fcf-8d03-4a13-a8df-b86575a3b13e","Type":"ContainerStarted","Data":"1d091bcef656dd3ebddb32269857cd0cbb5b74e5e1d2213a9bc9b5964da461d3"} Mar 17 12:12:10 crc kubenswrapper[4742]: I0317 12:12:10.254206 4742 generic.go:334] "Generic (PLEG): container finished" podID="f95c5fcf-8d03-4a13-a8df-b86575a3b13e" containerID="1d091bcef656dd3ebddb32269857cd0cbb5b74e5e1d2213a9bc9b5964da461d3" exitCode=0 Mar 17 12:12:10 crc kubenswrapper[4742]: I0317 12:12:10.254443 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wt87" event={"ID":"f95c5fcf-8d03-4a13-a8df-b86575a3b13e","Type":"ContainerDied","Data":"1d091bcef656dd3ebddb32269857cd0cbb5b74e5e1d2213a9bc9b5964da461d3"} Mar 17 12:12:11 crc kubenswrapper[4742]: I0317 12:12:11.300531 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wt87" event={"ID":"f95c5fcf-8d03-4a13-a8df-b86575a3b13e","Type":"ContainerStarted","Data":"0eaf619d5b92bcbf5067687b4b86e4ee01354fff4f29401219793ddf8dc0abad"} Mar 17 12:12:11 crc 
kubenswrapper[4742]: I0317 12:12:11.318751 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6wt87" podStartSLOduration=2.874603982 podStartE2EDuration="5.318736095s" podCreationTimestamp="2026-03-17 12:12:06 +0000 UTC" firstStartedPulling="2026-03-17 12:12:08.232627608 +0000 UTC m=+3631.358755366" lastFinishedPulling="2026-03-17 12:12:10.676759721 +0000 UTC m=+3633.802887479" observedRunningTime="2026-03-17 12:12:11.316719091 +0000 UTC m=+3634.442846849" watchObservedRunningTime="2026-03-17 12:12:11.318736095 +0000 UTC m=+3634.444863853" Mar 17 12:12:16 crc kubenswrapper[4742]: I0317 12:12:16.725367 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6wt87" Mar 17 12:12:16 crc kubenswrapper[4742]: I0317 12:12:16.726002 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6wt87" Mar 17 12:12:16 crc kubenswrapper[4742]: I0317 12:12:16.776945 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6wt87" Mar 17 12:12:17 crc kubenswrapper[4742]: I0317 12:12:17.415709 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6wt87" Mar 17 12:12:17 crc kubenswrapper[4742]: I0317 12:12:17.468650 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6wt87"] Mar 17 12:12:18 crc kubenswrapper[4742]: I0317 12:12:18.043629 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 12:12:18 crc kubenswrapper[4742]: I0317 12:12:18.043688 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 12:12:18 crc kubenswrapper[4742]: I0317 12:12:18.043736 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 12:12:18 crc kubenswrapper[4742]: I0317 12:12:18.044495 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f"} pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 12:12:18 crc kubenswrapper[4742]: I0317 12:12:18.044548 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" containerID="cri-o://6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" gracePeriod=600 Mar 17 12:12:18 crc kubenswrapper[4742]: E0317 12:12:18.175450 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:12:18 crc kubenswrapper[4742]: I0317 12:12:18.374774 4742 generic.go:334] "Generic (PLEG): container finished" podID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" exitCode=0 Mar 17 12:12:18 crc kubenswrapper[4742]: I0317 12:12:18.374860 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerDied","Data":"6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f"} Mar 17 12:12:18 crc kubenswrapper[4742]: I0317 12:12:18.374942 4742 scope.go:117] "RemoveContainer" containerID="5629d3aafca0c84011d832a91e511d2d987d5f8c9f7e5232caeaae344d4b0a53" Mar 17 12:12:18 crc kubenswrapper[4742]: I0317 12:12:18.376132 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:12:18 crc kubenswrapper[4742]: E0317 12:12:18.376568 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:12:19 crc kubenswrapper[4742]: I0317 12:12:19.383660 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6wt87" podUID="f95c5fcf-8d03-4a13-a8df-b86575a3b13e" containerName="registry-server" containerID="cri-o://0eaf619d5b92bcbf5067687b4b86e4ee01354fff4f29401219793ddf8dc0abad" gracePeriod=2 Mar 17 12:12:19 crc kubenswrapper[4742]: I0317 12:12:19.816497 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6wt87" Mar 17 12:12:19 crc kubenswrapper[4742]: I0317 12:12:19.973236 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9hrd\" (UniqueName: \"kubernetes.io/projected/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-kube-api-access-p9hrd\") pod \"f95c5fcf-8d03-4a13-a8df-b86575a3b13e\" (UID: \"f95c5fcf-8d03-4a13-a8df-b86575a3b13e\") " Mar 17 12:12:19 crc kubenswrapper[4742]: I0317 12:12:19.973624 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-utilities\") pod \"f95c5fcf-8d03-4a13-a8df-b86575a3b13e\" (UID: \"f95c5fcf-8d03-4a13-a8df-b86575a3b13e\") " Mar 17 12:12:19 crc kubenswrapper[4742]: I0317 12:12:19.973742 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-catalog-content\") pod \"f95c5fcf-8d03-4a13-a8df-b86575a3b13e\" (UID: \"f95c5fcf-8d03-4a13-a8df-b86575a3b13e\") " Mar 17 12:12:19 crc kubenswrapper[4742]: I0317 12:12:19.974171 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-utilities" (OuterVolumeSpecName: "utilities") pod "f95c5fcf-8d03-4a13-a8df-b86575a3b13e" (UID: "f95c5fcf-8d03-4a13-a8df-b86575a3b13e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:12:19 crc kubenswrapper[4742]: I0317 12:12:19.979148 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-kube-api-access-p9hrd" (OuterVolumeSpecName: "kube-api-access-p9hrd") pod "f95c5fcf-8d03-4a13-a8df-b86575a3b13e" (UID: "f95c5fcf-8d03-4a13-a8df-b86575a3b13e"). InnerVolumeSpecName "kube-api-access-p9hrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.027835 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f95c5fcf-8d03-4a13-a8df-b86575a3b13e" (UID: "f95c5fcf-8d03-4a13-a8df-b86575a3b13e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.075765 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9hrd\" (UniqueName: \"kubernetes.io/projected/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-kube-api-access-p9hrd\") on node \"crc\" DevicePath \"\"" Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.075811 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.075826 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f95c5fcf-8d03-4a13-a8df-b86575a3b13e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.400543 4742 generic.go:334] "Generic (PLEG): container finished" podID="f95c5fcf-8d03-4a13-a8df-b86575a3b13e" containerID="0eaf619d5b92bcbf5067687b4b86e4ee01354fff4f29401219793ddf8dc0abad" exitCode=0 Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.400617 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6wt87" Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.400626 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wt87" event={"ID":"f95c5fcf-8d03-4a13-a8df-b86575a3b13e","Type":"ContainerDied","Data":"0eaf619d5b92bcbf5067687b4b86e4ee01354fff4f29401219793ddf8dc0abad"} Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.400734 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wt87" event={"ID":"f95c5fcf-8d03-4a13-a8df-b86575a3b13e","Type":"ContainerDied","Data":"913a793c3449c47ca33bba1199d21cbeed9d675f0ae090846a4152f531a62921"} Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.400787 4742 scope.go:117] "RemoveContainer" containerID="0eaf619d5b92bcbf5067687b4b86e4ee01354fff4f29401219793ddf8dc0abad" Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.439814 4742 scope.go:117] "RemoveContainer" containerID="1d091bcef656dd3ebddb32269857cd0cbb5b74e5e1d2213a9bc9b5964da461d3" Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.446859 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6wt87"] Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.456561 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6wt87"] Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.472336 4742 scope.go:117] "RemoveContainer" containerID="fcb07cbaeb17ae1bded1123479ecb3926a6d0a3101aa37a5d3db286759e8be94" Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.527079 4742 scope.go:117] "RemoveContainer" containerID="0eaf619d5b92bcbf5067687b4b86e4ee01354fff4f29401219793ddf8dc0abad" Mar 17 12:12:20 crc kubenswrapper[4742]: E0317 12:12:20.528174 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eaf619d5b92bcbf5067687b4b86e4ee01354fff4f29401219793ddf8dc0abad\": container with ID starting with 0eaf619d5b92bcbf5067687b4b86e4ee01354fff4f29401219793ddf8dc0abad not found: ID does not exist" containerID="0eaf619d5b92bcbf5067687b4b86e4ee01354fff4f29401219793ddf8dc0abad" Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.528220 
4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eaf619d5b92bcbf5067687b4b86e4ee01354fff4f29401219793ddf8dc0abad"} err="failed to get container status \"0eaf619d5b92bcbf5067687b4b86e4ee01354fff4f29401219793ddf8dc0abad\": rpc error: code = NotFound desc = could not find container \"0eaf619d5b92bcbf5067687b4b86e4ee01354fff4f29401219793ddf8dc0abad\": container with ID starting with 0eaf619d5b92bcbf5067687b4b86e4ee01354fff4f29401219793ddf8dc0abad not found: ID does not exist" Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.528245 4742 scope.go:117] "RemoveContainer" containerID="1d091bcef656dd3ebddb32269857cd0cbb5b74e5e1d2213a9bc9b5964da461d3" Mar 17 12:12:20 crc kubenswrapper[4742]: E0317 12:12:20.528742 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d091bcef656dd3ebddb32269857cd0cbb5b74e5e1d2213a9bc9b5964da461d3\": container with ID starting with 1d091bcef656dd3ebddb32269857cd0cbb5b74e5e1d2213a9bc9b5964da461d3 not found: ID does not exist" containerID="1d091bcef656dd3ebddb32269857cd0cbb5b74e5e1d2213a9bc9b5964da461d3" Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.528774 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d091bcef656dd3ebddb32269857cd0cbb5b74e5e1d2213a9bc9b5964da461d3"} err="failed to get container status \"1d091bcef656dd3ebddb32269857cd0cbb5b74e5e1d2213a9bc9b5964da461d3\": rpc error: code = NotFound desc = could not find container \"1d091bcef656dd3ebddb32269857cd0cbb5b74e5e1d2213a9bc9b5964da461d3\": container with ID starting with 1d091bcef656dd3ebddb32269857cd0cbb5b74e5e1d2213a9bc9b5964da461d3 not found: ID does not exist" Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.528795 4742 scope.go:117] "RemoveContainer" containerID="fcb07cbaeb17ae1bded1123479ecb3926a6d0a3101aa37a5d3db286759e8be94" Mar 17 12:12:20 crc kubenswrapper[4742]: E0317 12:12:20.529305 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcb07cbaeb17ae1bded1123479ecb3926a6d0a3101aa37a5d3db286759e8be94\": container with ID starting with fcb07cbaeb17ae1bded1123479ecb3926a6d0a3101aa37a5d3db286759e8be94 not found: ID does not exist" containerID="fcb07cbaeb17ae1bded1123479ecb3926a6d0a3101aa37a5d3db286759e8be94" Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.529334 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb07cbaeb17ae1bded1123479ecb3926a6d0a3101aa37a5d3db286759e8be94"} err="failed to get container status \"fcb07cbaeb17ae1bded1123479ecb3926a6d0a3101aa37a5d3db286759e8be94\": rpc error: code = NotFound desc = could not find container \"fcb07cbaeb17ae1bded1123479ecb3926a6d0a3101aa37a5d3db286759e8be94\": container with ID starting with fcb07cbaeb17ae1bded1123479ecb3926a6d0a3101aa37a5d3db286759e8be94 not found: ID does not exist" Mar 17 12:12:20 crc kubenswrapper[4742]: I0317 12:12:20.679008 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f95c5fcf-8d03-4a13-a8df-b86575a3b13e" path="/var/lib/kubelet/pods/f95c5fcf-8d03-4a13-a8df-b86575a3b13e/volumes" Mar 17 12:12:28 crc kubenswrapper[4742]: I0317 12:12:28.671293 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:12:28 crc kubenswrapper[4742]: E0317 12:12:28.672401 4742 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:12:29 crc kubenswrapper[4742]: I0317 12:12:29.501814 4742 generic.go:334] "Generic (PLEG): container finished" podID="7e3db6d8-ce89-4d0b-8491-91cd81d476a3" containerID="72e91e07454c6d2075a2193a1d0b3014df3c7c4b731f73fd56151126ba1f7be7" exitCode=0 Mar 17 12:12:29 crc kubenswrapper[4742]: I0317 12:12:29.501939 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mltbx/crc-debug-jsjq5" event={"ID":"7e3db6d8-ce89-4d0b-8491-91cd81d476a3","Type":"ContainerDied","Data":"72e91e07454c6d2075a2193a1d0b3014df3c7c4b731f73fd56151126ba1f7be7"} Mar 17 12:12:30 crc kubenswrapper[4742]: I0317 12:12:30.601341 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mltbx/crc-debug-jsjq5" Mar 17 12:12:30 crc kubenswrapper[4742]: I0317 12:12:30.646675 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mltbx/crc-debug-jsjq5"] Mar 17 12:12:30 crc kubenswrapper[4742]: I0317 12:12:30.654865 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mltbx/crc-debug-jsjq5"] Mar 17 12:12:30 crc kubenswrapper[4742]: I0317 12:12:30.673284 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e3db6d8-ce89-4d0b-8491-91cd81d476a3-host\") pod \"7e3db6d8-ce89-4d0b-8491-91cd81d476a3\" (UID: \"7e3db6d8-ce89-4d0b-8491-91cd81d476a3\") " Mar 17 12:12:30 crc kubenswrapper[4742]: I0317 12:12:30.673382 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7e3db6d8-ce89-4d0b-8491-91cd81d476a3-host" (OuterVolumeSpecName: "host") pod "7e3db6d8-ce89-4d0b-8491-91cd81d476a3" (UID: "7e3db6d8-ce89-4d0b-8491-91cd81d476a3"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 12:12:30 crc kubenswrapper[4742]: I0317 12:12:30.673493 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brfqj\" (UniqueName: \"kubernetes.io/projected/7e3db6d8-ce89-4d0b-8491-91cd81d476a3-kube-api-access-brfqj\") pod \"7e3db6d8-ce89-4d0b-8491-91cd81d476a3\" (UID: \"7e3db6d8-ce89-4d0b-8491-91cd81d476a3\") " Mar 17 12:12:30 crc kubenswrapper[4742]: I0317 12:12:30.673980 4742 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7e3db6d8-ce89-4d0b-8491-91cd81d476a3-host\") on node \"crc\" DevicePath \"\"" Mar 17 12:12:30 crc kubenswrapper[4742]: I0317 12:12:30.678475 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e3db6d8-ce89-4d0b-8491-91cd81d476a3-kube-api-access-brfqj" (OuterVolumeSpecName: "kube-api-access-brfqj") pod "7e3db6d8-ce89-4d0b-8491-91cd81d476a3" (UID: "7e3db6d8-ce89-4d0b-8491-91cd81d476a3"). InnerVolumeSpecName "kube-api-access-brfqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:12:30 crc kubenswrapper[4742]: I0317 12:12:30.775871 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brfqj\" (UniqueName: \"kubernetes.io/projected/7e3db6d8-ce89-4d0b-8491-91cd81d476a3-kube-api-access-brfqj\") on node \"crc\" DevicePath \"\"" Mar 17 12:12:31 crc kubenswrapper[4742]: I0317 12:12:31.523152 4742 scope.go:117] "RemoveContainer" containerID="72e91e07454c6d2075a2193a1d0b3014df3c7c4b731f73fd56151126ba1f7be7" Mar 17 12:12:31 crc kubenswrapper[4742]: I0317 12:12:31.523194 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mltbx/crc-debug-jsjq5" Mar 17 12:12:31 crc kubenswrapper[4742]: I0317 12:12:31.972166 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mltbx/crc-debug-kj79m"] Mar 17 12:12:31 crc kubenswrapper[4742]: E0317 12:12:31.972625 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f95c5fcf-8d03-4a13-a8df-b86575a3b13e" containerName="registry-server" Mar 17 12:12:31 crc kubenswrapper[4742]: I0317 12:12:31.972642 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f95c5fcf-8d03-4a13-a8df-b86575a3b13e" containerName="registry-server" Mar 17 12:12:31 crc kubenswrapper[4742]: E0317 12:12:31.972662 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f95c5fcf-8d03-4a13-a8df-b86575a3b13e" containerName="extract-utilities" Mar 17 12:12:31 crc kubenswrapper[4742]: I0317 12:12:31.972673 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f95c5fcf-8d03-4a13-a8df-b86575a3b13e" containerName="extract-utilities" Mar 17 12:12:31 crc kubenswrapper[4742]: E0317 12:12:31.972700 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e3db6d8-ce89-4d0b-8491-91cd81d476a3" containerName="container-00" Mar 17 12:12:31 crc kubenswrapper[4742]: I0317 12:12:31.972708 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e3db6d8-ce89-4d0b-8491-91cd81d476a3" containerName="container-00" Mar 17 12:12:31 crc kubenswrapper[4742]: E0317 12:12:31.972721 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c511e2-9f3d-45fd-a671-37a745506f9b" containerName="oc" Mar 17 12:12:31 crc kubenswrapper[4742]: I0317 12:12:31.972728 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c511e2-9f3d-45fd-a671-37a745506f9b" containerName="oc" Mar 17 12:12:31 crc kubenswrapper[4742]: E0317 12:12:31.972745 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f95c5fcf-8d03-4a13-a8df-b86575a3b13e" containerName="extract-content" Mar 17 12:12:31 crc kubenswrapper[4742]: I0317 12:12:31.972754 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="f95c5fcf-8d03-4a13-a8df-b86575a3b13e" containerName="extract-content" Mar 17 12:12:31 crc kubenswrapper[4742]: I0317 12:12:31.973110 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e3db6d8-ce89-4d0b-8491-91cd81d476a3" containerName="container-00" Mar 17 12:12:31 crc kubenswrapper[4742]: I0317 12:12:31.973134 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c511e2-9f3d-45fd-a671-37a745506f9b" containerName="oc" Mar 17 12:12:31 crc kubenswrapper[4742]: I0317 12:12:31.973144 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="f95c5fcf-8d03-4a13-a8df-b86575a3b13e" containerName="registry-server" Mar 17 12:12:31 crc kubenswrapper[4742]: I0317 12:12:31.974100 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mltbx/crc-debug-kj79m" Mar 17 12:12:31 crc kubenswrapper[4742]: I0317 12:12:31.976022 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mltbx"/"default-dockercfg-rd4mc" Mar 17 12:12:31 crc kubenswrapper[4742]: I0317 12:12:31.997044 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqj4g\" (UniqueName: \"kubernetes.io/projected/1d81ce86-639a-4e46-9669-c4e52b7cb4e4-kube-api-access-gqj4g\") pod \"crc-debug-kj79m\" (UID: \"1d81ce86-639a-4e46-9669-c4e52b7cb4e4\") " pod="openshift-must-gather-mltbx/crc-debug-kj79m" Mar 17 12:12:31 crc kubenswrapper[4742]: I0317 12:12:31.997092 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d81ce86-639a-4e46-9669-c4e52b7cb4e4-host\") pod \"crc-debug-kj79m\" (UID: \"1d81ce86-639a-4e46-9669-c4e52b7cb4e4\") " pod="openshift-must-gather-mltbx/crc-debug-kj79m" Mar 17 12:12:32 crc kubenswrapper[4742]: I0317 12:12:32.099103 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqj4g\" (UniqueName: \"kubernetes.io/projected/1d81ce86-639a-4e46-9669-c4e52b7cb4e4-kube-api-access-gqj4g\") pod \"crc-debug-kj79m\" (UID: \"1d81ce86-639a-4e46-9669-c4e52b7cb4e4\") " pod="openshift-must-gather-mltbx/crc-debug-kj79m" Mar 17 12:12:32 crc kubenswrapper[4742]: I0317 12:12:32.099144 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d81ce86-639a-4e46-9669-c4e52b7cb4e4-host\") pod \"crc-debug-kj79m\" (UID: \"1d81ce86-639a-4e46-9669-c4e52b7cb4e4\") " pod="openshift-must-gather-mltbx/crc-debug-kj79m" Mar 17 12:12:32 crc kubenswrapper[4742]: I0317 12:12:32.099267 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d81ce86-639a-4e46-9669-c4e52b7cb4e4-host\") pod \"crc-debug-kj79m\" (UID: \"1d81ce86-639a-4e46-9669-c4e52b7cb4e4\") " pod="openshift-must-gather-mltbx/crc-debug-kj79m" Mar 17 12:12:32 crc kubenswrapper[4742]: I0317 12:12:32.117213 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqj4g\" (UniqueName: \"kubernetes.io/projected/1d81ce86-639a-4e46-9669-c4e52b7cb4e4-kube-api-access-gqj4g\") pod \"crc-debug-kj79m\" (UID: \"1d81ce86-639a-4e46-9669-c4e52b7cb4e4\") " pod="openshift-must-gather-mltbx/crc-debug-kj79m" Mar 17 12:12:32 crc kubenswrapper[4742]: I0317 12:12:32.290436 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mltbx/crc-debug-kj79m" Mar 17 12:12:32 crc kubenswrapper[4742]: I0317 12:12:32.536644 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mltbx/crc-debug-kj79m" event={"ID":"1d81ce86-639a-4e46-9669-c4e52b7cb4e4","Type":"ContainerStarted","Data":"453769648d63d1a3a6e73c3d435b498c7de1fa5258aada33187733c147fda738"} Mar 17 12:12:32 crc kubenswrapper[4742]: I0317 12:12:32.677975 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e3db6d8-ce89-4d0b-8491-91cd81d476a3" path="/var/lib/kubelet/pods/7e3db6d8-ce89-4d0b-8491-91cd81d476a3/volumes" Mar 17 12:12:33 crc kubenswrapper[4742]: I0317 12:12:33.546281 4742 generic.go:334] "Generic (PLEG): container finished" podID="1d81ce86-639a-4e46-9669-c4e52b7cb4e4" containerID="8c7ff93425791d27aba1b6ed83c4fe388b793daedf6382799714920c18f6d8e1" exitCode=0 Mar 17 12:12:33 crc kubenswrapper[4742]: I0317 12:12:33.546333 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mltbx/crc-debug-kj79m" event={"ID":"1d81ce86-639a-4e46-9669-c4e52b7cb4e4","Type":"ContainerDied","Data":"8c7ff93425791d27aba1b6ed83c4fe388b793daedf6382799714920c18f6d8e1"} Mar 17 12:12:33 crc kubenswrapper[4742]: I0317 12:12:33.971618 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mltbx/crc-debug-kj79m"] Mar 17 12:12:33 crc kubenswrapper[4742]: I0317 12:12:33.979645 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mltbx/crc-debug-kj79m"] Mar 17 12:12:34 crc kubenswrapper[4742]: I0317 12:12:34.678955 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mltbx/crc-debug-kj79m" Mar 17 12:12:34 crc kubenswrapper[4742]: I0317 12:12:34.744875 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqj4g\" (UniqueName: \"kubernetes.io/projected/1d81ce86-639a-4e46-9669-c4e52b7cb4e4-kube-api-access-gqj4g\") pod \"1d81ce86-639a-4e46-9669-c4e52b7cb4e4\" (UID: \"1d81ce86-639a-4e46-9669-c4e52b7cb4e4\") " Mar 17 12:12:34 crc kubenswrapper[4742]: I0317 12:12:34.745421 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d81ce86-639a-4e46-9669-c4e52b7cb4e4-host\") pod \"1d81ce86-639a-4e46-9669-c4e52b7cb4e4\" (UID: \"1d81ce86-639a-4e46-9669-c4e52b7cb4e4\") " Mar 17 12:12:34 crc kubenswrapper[4742]: I0317 12:12:34.745497 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d81ce86-639a-4e46-9669-c4e52b7cb4e4-host" (OuterVolumeSpecName: "host") pod "1d81ce86-639a-4e46-9669-c4e52b7cb4e4" (UID: "1d81ce86-639a-4e46-9669-c4e52b7cb4e4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 12:12:34 crc kubenswrapper[4742]: I0317 12:12:34.746290 4742 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d81ce86-639a-4e46-9669-c4e52b7cb4e4-host\") on node \"crc\" DevicePath \"\"" Mar 17 12:12:34 crc kubenswrapper[4742]: I0317 12:12:34.750933 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d81ce86-639a-4e46-9669-c4e52b7cb4e4-kube-api-access-gqj4g" (OuterVolumeSpecName: "kube-api-access-gqj4g") pod "1d81ce86-639a-4e46-9669-c4e52b7cb4e4" (UID: "1d81ce86-639a-4e46-9669-c4e52b7cb4e4"). InnerVolumeSpecName "kube-api-access-gqj4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:12:34 crc kubenswrapper[4742]: I0317 12:12:34.847661 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqj4g\" (UniqueName: \"kubernetes.io/projected/1d81ce86-639a-4e46-9669-c4e52b7cb4e4-kube-api-access-gqj4g\") on node \"crc\" DevicePath \"\"" Mar 17 12:12:35 crc kubenswrapper[4742]: I0317 12:12:35.203492 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mltbx/crc-debug-bfw5h"] Mar 17 12:12:35 crc kubenswrapper[4742]: E0317 12:12:35.204067 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d81ce86-639a-4e46-9669-c4e52b7cb4e4" containerName="container-00" Mar 17 12:12:35 crc kubenswrapper[4742]: I0317 12:12:35.204083 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d81ce86-639a-4e46-9669-c4e52b7cb4e4" containerName="container-00" Mar 17 12:12:35 crc kubenswrapper[4742]: I0317 12:12:35.204288 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d81ce86-639a-4e46-9669-c4e52b7cb4e4" containerName="container-00" Mar 17 12:12:35 crc kubenswrapper[4742]: I0317 12:12:35.204860 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mltbx/crc-debug-bfw5h" Mar 17 12:12:35 crc kubenswrapper[4742]: I0317 12:12:35.254343 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwxms\" (UniqueName: \"kubernetes.io/projected/67b20216-9389-48ee-bcb0-50b7981401e4-kube-api-access-rwxms\") pod \"crc-debug-bfw5h\" (UID: \"67b20216-9389-48ee-bcb0-50b7981401e4\") " pod="openshift-must-gather-mltbx/crc-debug-bfw5h" Mar 17 12:12:35 crc kubenswrapper[4742]: I0317 12:12:35.254652 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67b20216-9389-48ee-bcb0-50b7981401e4-host\") pod \"crc-debug-bfw5h\" (UID: \"67b20216-9389-48ee-bcb0-50b7981401e4\") " pod="openshift-must-gather-mltbx/crc-debug-bfw5h" Mar 17 12:12:35 crc kubenswrapper[4742]: I0317 12:12:35.357107 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwxms\" (UniqueName: \"kubernetes.io/projected/67b20216-9389-48ee-bcb0-50b7981401e4-kube-api-access-rwxms\") pod \"crc-debug-bfw5h\" (UID: \"67b20216-9389-48ee-bcb0-50b7981401e4\") " pod="openshift-must-gather-mltbx/crc-debug-bfw5h" Mar 17 12:12:35 crc kubenswrapper[4742]: I0317 12:12:35.357255 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67b20216-9389-48ee-bcb0-50b7981401e4-host\") pod \"crc-debug-bfw5h\" (UID: \"67b20216-9389-48ee-bcb0-50b7981401e4\") " pod="openshift-must-gather-mltbx/crc-debug-bfw5h" Mar 17 12:12:35 crc kubenswrapper[4742]: I0317 12:12:35.357511 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67b20216-9389-48ee-bcb0-50b7981401e4-host\") pod \"crc-debug-bfw5h\" (UID: \"67b20216-9389-48ee-bcb0-50b7981401e4\") " pod="openshift-must-gather-mltbx/crc-debug-bfw5h" Mar 17 12:12:35 crc kubenswrapper[4742]: I0317 12:12:35.385157 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwxms\" (UniqueName: \"kubernetes.io/projected/67b20216-9389-48ee-bcb0-50b7981401e4-kube-api-access-rwxms\") pod \"crc-debug-bfw5h\" (UID: \"67b20216-9389-48ee-bcb0-50b7981401e4\") " 
pod="openshift-must-gather-mltbx/crc-debug-bfw5h" Mar 17 12:12:35 crc kubenswrapper[4742]: I0317 12:12:35.518900 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mltbx/crc-debug-bfw5h" Mar 17 12:12:35 crc kubenswrapper[4742]: I0317 12:12:35.571265 4742 scope.go:117] "RemoveContainer" containerID="8c7ff93425791d27aba1b6ed83c4fe388b793daedf6382799714920c18f6d8e1" Mar 17 12:12:35 crc kubenswrapper[4742]: I0317 12:12:35.571409 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mltbx/crc-debug-kj79m" Mar 17 12:12:36 crc kubenswrapper[4742]: I0317 12:12:36.581637 4742 generic.go:334] "Generic (PLEG): container finished" podID="67b20216-9389-48ee-bcb0-50b7981401e4" containerID="6d3fe097f813380db41e822338324333e79f3f85d15760422b5f235eab814d64" exitCode=0 Mar 17 12:12:36 crc kubenswrapper[4742]: I0317 12:12:36.581752 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mltbx/crc-debug-bfw5h" event={"ID":"67b20216-9389-48ee-bcb0-50b7981401e4","Type":"ContainerDied","Data":"6d3fe097f813380db41e822338324333e79f3f85d15760422b5f235eab814d64"} Mar 17 12:12:36 crc kubenswrapper[4742]: I0317 12:12:36.582282 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mltbx/crc-debug-bfw5h" event={"ID":"67b20216-9389-48ee-bcb0-50b7981401e4","Type":"ContainerStarted","Data":"5cb5151240467f3a9eafc8cd1079a4acc29f1e4134e8c8c48379e0d3e6becefc"} Mar 17 12:12:36 crc kubenswrapper[4742]: I0317 12:12:36.626564 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mltbx/crc-debug-bfw5h"] Mar 17 12:12:36 crc kubenswrapper[4742]: I0317 12:12:36.638014 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mltbx/crc-debug-bfw5h"] Mar 17 12:12:36 crc kubenswrapper[4742]: I0317 12:12:36.675198 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d81ce86-639a-4e46-9669-c4e52b7cb4e4" path="/var/lib/kubelet/pods/1d81ce86-639a-4e46-9669-c4e52b7cb4e4/volumes" Mar 17 12:12:37 crc kubenswrapper[4742]: I0317 12:12:37.697233 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mltbx/crc-debug-bfw5h" Mar 17 12:12:37 crc kubenswrapper[4742]: I0317 12:12:37.813231 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67b20216-9389-48ee-bcb0-50b7981401e4-host\") pod \"67b20216-9389-48ee-bcb0-50b7981401e4\" (UID: \"67b20216-9389-48ee-bcb0-50b7981401e4\") " Mar 17 12:12:37 crc kubenswrapper[4742]: I0317 12:12:37.813324 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwxms\" (UniqueName: \"kubernetes.io/projected/67b20216-9389-48ee-bcb0-50b7981401e4-kube-api-access-rwxms\") pod \"67b20216-9389-48ee-bcb0-50b7981401e4\" (UID: \"67b20216-9389-48ee-bcb0-50b7981401e4\") " Mar 17 12:12:37 crc kubenswrapper[4742]: I0317 12:12:37.813349 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67b20216-9389-48ee-bcb0-50b7981401e4-host" (OuterVolumeSpecName: "host") pod "67b20216-9389-48ee-bcb0-50b7981401e4" (UID: "67b20216-9389-48ee-bcb0-50b7981401e4"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 12:12:37 crc kubenswrapper[4742]: I0317 12:12:37.813894 4742 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/67b20216-9389-48ee-bcb0-50b7981401e4-host\") on node \"crc\" DevicePath \"\"" Mar 17 12:12:37 crc kubenswrapper[4742]: I0317 12:12:37.823789 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67b20216-9389-48ee-bcb0-50b7981401e4-kube-api-access-rwxms" (OuterVolumeSpecName: "kube-api-access-rwxms") pod "67b20216-9389-48ee-bcb0-50b7981401e4" (UID: "67b20216-9389-48ee-bcb0-50b7981401e4"). InnerVolumeSpecName "kube-api-access-rwxms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:12:37 crc kubenswrapper[4742]: I0317 12:12:37.916042 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwxms\" (UniqueName: \"kubernetes.io/projected/67b20216-9389-48ee-bcb0-50b7981401e4-kube-api-access-rwxms\") on node \"crc\" DevicePath \"\"" Mar 17 12:12:38 crc kubenswrapper[4742]: I0317 12:12:38.604941 4742 scope.go:117] "RemoveContainer" containerID="6d3fe097f813380db41e822338324333e79f3f85d15760422b5f235eab814d64" Mar 17 12:12:38 crc kubenswrapper[4742]: I0317 12:12:38.604995 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mltbx/crc-debug-bfw5h" Mar 17 12:12:38 crc kubenswrapper[4742]: I0317 12:12:38.676003 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67b20216-9389-48ee-bcb0-50b7981401e4" path="/var/lib/kubelet/pods/67b20216-9389-48ee-bcb0-50b7981401e4/volumes" Mar 17 12:12:40 crc kubenswrapper[4742]: I0317 12:12:40.663258 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:12:40 crc kubenswrapper[4742]: E0317 12:12:40.663823 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:12:54 crc kubenswrapper[4742]: I0317 12:12:54.202576 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6f76787fd-cvxz9_f550d045-d552-4ea9-b5c8-a4e7d9ff29a1/barbican-api/0.log" Mar 17 12:12:54 crc kubenswrapper[4742]: I0317 12:12:54.256347 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6f76787fd-cvxz9_f550d045-d552-4ea9-b5c8-a4e7d9ff29a1/barbican-api-log/0.log" Mar 17 12:12:54 crc kubenswrapper[4742]: I0317 12:12:54.372492 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-bf65fb77d-664w7_8ac953fc-7316-4941-920f-8298fd752c3a/barbican-keystone-listener/0.log" Mar 17 12:12:54 crc kubenswrapper[4742]: I0317 12:12:54.412325 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-bf65fb77d-664w7_8ac953fc-7316-4941-920f-8298fd752c3a/barbican-keystone-listener-log/0.log" Mar 17 12:12:54 crc kubenswrapper[4742]: I0317 12:12:54.575362 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-68ddcd6d89-6jx5j_1b377427-ca51-4054-9725-545bba6b9319/barbican-worker/0.log" Mar 17 12:12:54 crc 
Mar 17 12:12:54 crc kubenswrapper[4742]: I0317 12:12:54.648344 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-68ddcd6d89-6jx5j_1b377427-ca51-4054-9725-545bba6b9319/barbican-worker-log/0.log"
Mar 17 12:12:54 crc kubenswrapper[4742]: I0317 12:12:54.662801 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc_e6bf81f0-73d3-4dde-937d-87bbea94c36e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 17 12:12:54 crc kubenswrapper[4742]: I0317 12:12:54.866004 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ea32fef3-81ea-41cb-8641-3a43304683c6/ceilometer-notification-agent/0.log"
Mar 17 12:12:54 crc kubenswrapper[4742]: I0317 12:12:54.874313 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ea32fef3-81ea-41cb-8641-3a43304683c6/ceilometer-central-agent/0.log"
Mar 17 12:12:54 crc kubenswrapper[4742]: I0317 12:12:54.906965 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ea32fef3-81ea-41cb-8641-3a43304683c6/proxy-httpd/0.log"
Mar 17 12:12:55 crc kubenswrapper[4742]: I0317 12:12:55.000474 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ea32fef3-81ea-41cb-8641-3a43304683c6/sg-core/0.log"
Mar 17 12:12:55 crc kubenswrapper[4742]: I0317 12:12:55.088888 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e36e2fb7-b344-4c81-9922-3d9bc9526261/cinder-api/0.log"
Mar 17 12:12:55 crc kubenswrapper[4742]: I0317 12:12:55.109172 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e36e2fb7-b344-4c81-9922-3d9bc9526261/cinder-api-log/0.log"
Mar 17 12:12:55 crc kubenswrapper[4742]: I0317 12:12:55.270385 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_12f6380b-463f-4c5b-9c4a-809c874b2ca5/probe/0.log"
Mar 17 12:12:55 crc kubenswrapper[4742]: I0317 12:12:55.310123 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_12f6380b-463f-4c5b-9c4a-809c874b2ca5/cinder-scheduler/0.log"
Mar 17 12:12:55 crc kubenswrapper[4742]: I0317 12:12:55.400802 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh_bd4b8d37-8f12-4560-b616-cbbed45a7cb2/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 17 12:12:55 crc kubenswrapper[4742]: I0317 12:12:55.549969 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv_bfb05f67-f7aa-480f-a4e9-3f24ee2102d4/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 17 12:12:55 crc kubenswrapper[4742]: I0317 12:12:55.598084 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-z42hc_d3035223-2765-4ce8-ac14-f53ffcca7a1b/init/0.log"
Mar 17 12:12:55 crc kubenswrapper[4742]: I0317 12:12:55.663590 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f"
Mar 17 12:12:55 crc kubenswrapper[4742]: E0317 12:12:55.663843 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882"
Mar 17 12:12:55 crc kubenswrapper[4742]: I0317 12:12:55.780550 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-z42hc_d3035223-2765-4ce8-ac14-f53ffcca7a1b/dnsmasq-dns/0.log"
Mar 17 12:12:55 crc kubenswrapper[4742]: I0317 12:12:55.787427 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-z42hc_d3035223-2765-4ce8-ac14-f53ffcca7a1b/init/0.log"
Mar 17 12:12:55 crc kubenswrapper[4742]: I0317 12:12:55.837704 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mvrds_a8691841-aa32-407b-bbdc-97c5551ec591/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 17 12:12:55 crc kubenswrapper[4742]: I0317 12:12:55.978748 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fdc48ac3-7501-4e63-9290-bff06909b045/glance-httpd/0.log"
Mar 17 12:12:56 crc kubenswrapper[4742]: I0317 12:12:56.006440 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fdc48ac3-7501-4e63-9290-bff06909b045/glance-log/0.log"
Mar 17 12:12:56 crc kubenswrapper[4742]: I0317 12:12:56.193005 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c030ab26-9079-49cf-837f-c0625cfe6cc3/glance-httpd/0.log"
Mar 17 12:12:56 crc kubenswrapper[4742]: I0317 12:12:56.219525 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c030ab26-9079-49cf-837f-c0625cfe6cc3/glance-log/0.log"
Mar 17 12:12:56 crc kubenswrapper[4742]: I0317 12:12:56.434374 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5c4556b444-kq454_480fea20-eab5-4c68-9bc3-9b218ba0b43d/horizon/0.log"
Mar 17 12:12:56 crc kubenswrapper[4742]: I0317 12:12:56.533543 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-6g95k_62491de6-4c04-49d7-82f2-124f6cceff11/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 17 12:12:56 crc kubenswrapper[4742]: I0317 12:12:56.704947 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5c4556b444-kq454_480fea20-eab5-4c68-9bc3-9b218ba0b43d/horizon-log/0.log"
Mar 17 12:12:56 crc kubenswrapper[4742]: I0317 12:12:56.744001 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-lk8g5_71aa9411-3abc-46dd-9907-3f2847f83866/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 17 12:12:57 crc kubenswrapper[4742]: I0317 12:12:57.013835 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cf69c6b9b-d9hmq_896b4ef2-200c-4981-b22f-d93e9979c130/keystone-api/0.log"
Mar 17 12:12:57 crc kubenswrapper[4742]: I0317 12:12:57.142018 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29562481-pwxss_7cfb9cd7-2718-4547-a238-e62cfa4f3cb5/keystone-cron/0.log"
Mar 17 12:12:57 crc kubenswrapper[4742]: I0317 12:12:57.332923 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9_7fd024b3-844f-4118-92b5-81dcc6da9fd6/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 17 12:12:57 crc kubenswrapper[4742]: I0317 12:12:57.353496 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_47db9f5f-1a39-4137-bc97-bf3192c64ced/kube-state-metrics/0.log"
Mar 17 12:12:57 crc kubenswrapper[4742]: I0317 12:12:57.435131 4742 scope.go:117] "RemoveContainer" containerID="927b24b4bc5a0eba7616b925f65e5e5173963560106466454d3fdf5e05542eb0"
Mar 17 12:12:57 crc kubenswrapper[4742]: I0317 12:12:57.754050 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-c7d48c699-86xxh_1ccfa960-12b9-4537-b822-89da493f780c/neutron-api/0.log"
Mar 17 12:12:57 crc kubenswrapper[4742]: I0317 12:12:57.823018 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-c7d48c699-86xxh_1ccfa960-12b9-4537-b822-89da493f780c/neutron-httpd/0.log"
Mar 17 12:12:57 crc kubenswrapper[4742]: I0317 12:12:57.984749 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt_764bf75a-9487-4005-b6ee-ca369e722c4a/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 17 12:12:58 crc kubenswrapper[4742]: I0317 12:12:58.459758 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f2b889c9-de23-4357-956c-1684e42c64de/nova-api-log/0.log"
Mar 17 12:12:58 crc kubenswrapper[4742]: I0317 12:12:58.469120 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_32a50429-d785-408b-b53f-fef4700692c6/nova-cell0-conductor-conductor/0.log"
Mar 17 12:12:58 crc kubenswrapper[4742]: I0317 12:12:58.766051 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_47ecd8fa-016c-43b5-9d9f-42c776c8e38d/nova-cell1-conductor-conductor/0.log"
Mar 17 12:12:58 crc kubenswrapper[4742]: I0317 12:12:58.782183 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f2b889c9-de23-4357-956c-1684e42c64de/nova-api-api/0.log"
Mar 17 12:12:58 crc kubenswrapper[4742]: I0317 12:12:58.796997 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_00a7a363-ec82-40a4-8121-fd6839727132/nova-cell1-novncproxy-novncproxy/0.log"
Mar 17 12:12:58 crc kubenswrapper[4742]: I0317 12:12:58.987141 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-76jn7_6468192a-58e3-4b66-9551-1d67dc93f0ae/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 17 12:12:59 crc kubenswrapper[4742]: I0317 12:12:59.108285 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8f6a1398-04d6-4668-9689-17bdbb214850/nova-metadata-log/0.log"
Mar 17 12:12:59 crc kubenswrapper[4742]: I0317 12:12:59.356031 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_5c8591a2-6548-4bcb-8be3-71e549605bd2/nova-scheduler-scheduler/0.log"
Mar 17 12:12:59 crc kubenswrapper[4742]: I0317 12:12:59.427713 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5/mysql-bootstrap/0.log"
Mar 17 12:12:59 crc kubenswrapper[4742]: I0317 12:12:59.490822 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8f6a1398-04d6-4668-9689-17bdbb214850/nova-metadata-metadata/0.log"
Mar 17 12:12:59 crc kubenswrapper[4742]: I0317 12:12:59.678137 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5/mysql-bootstrap/0.log"
Mar 17 12:12:59 crc kubenswrapper[4742]: I0317 12:12:59.730156 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5/galera/0.log"
Mar 17 12:12:59 crc kubenswrapper[4742]: I0317 12:12:59.740358 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_91d27a2f-a471-4f90-aabb-9a021036805e/mysql-bootstrap/0.log"
Mar 17 12:12:59 crc kubenswrapper[4742]: I0317 12:12:59.981250 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_91d27a2f-a471-4f90-aabb-9a021036805e/galera/0.log"
Mar 17 12:12:59 crc kubenswrapper[4742]: I0317 12:12:59.995069 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_91d27a2f-a471-4f90-aabb-9a021036805e/mysql-bootstrap/0.log"
Mar 17 12:13:00 crc kubenswrapper[4742]: I0317 12:13:00.024689 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_11e12da8-9e80-453f-bbbd-03d1346afe5b/openstackclient/0.log"
Mar 17 12:13:00 crc kubenswrapper[4742]: I0317 12:13:00.165281 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-4j5jz_158a0d7f-e22f-4f44-aca2-efb59ff90439/ovn-controller/0.log"
Mar 17 12:13:00 crc kubenswrapper[4742]: I0317 12:13:00.460432 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-pmxjd_0a50ef5e-ba73-4d00-baba-b8ef6c621d71/openstack-network-exporter/0.log"
Mar 17 12:13:00 crc kubenswrapper[4742]: I0317 12:13:00.645762 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dmqzv_dd5cf259-c4bf-44cf-b101-bcc78c153852/ovsdb-server-init/0.log"
Mar 17 12:13:00 crc kubenswrapper[4742]: I0317 12:13:00.811772 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dmqzv_dd5cf259-c4bf-44cf-b101-bcc78c153852/ovsdb-server-init/0.log"
Mar 17 12:13:00 crc kubenswrapper[4742]: I0317 12:13:00.851530 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dmqzv_dd5cf259-c4bf-44cf-b101-bcc78c153852/ovs-vswitchd/0.log"
Mar 17 12:13:00 crc kubenswrapper[4742]: I0317 12:13:00.870975 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dmqzv_dd5cf259-c4bf-44cf-b101-bcc78c153852/ovsdb-server/0.log"
Mar 17 12:13:01 crc kubenswrapper[4742]: I0317 12:13:01.070800 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-zgs6w_9e7470ef-476f-4d0e-b7ec-349fbc6eff76/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 17 12:13:01 crc kubenswrapper[4742]: I0317 12:13:01.100825 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_56194d57-077f-40f4-87f6-386942ac0f6b/openstack-network-exporter/0.log"
Mar 17 12:13:01 crc kubenswrapper[4742]: I0317 12:13:01.145484 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_56194d57-077f-40f4-87f6-386942ac0f6b/ovn-northd/0.log"
Mar 17 12:13:01 crc kubenswrapper[4742]: I0317 12:13:01.316266 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7a4f3d5f-526a-4163-8dbb-a019050a0e03/ovsdbserver-nb/0.log"
Mar 17 12:13:01 crc kubenswrapper[4742]: I0317 12:13:01.333076 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7a4f3d5f-526a-4163-8dbb-a019050a0e03/openstack-network-exporter/0.log"
Mar 17 12:13:01 crc kubenswrapper[4742]: I0317 12:13:01.524387 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3bba6aef-f8ff-436a-b3c1-97fbe9819ff1/ovsdbserver-sb/0.log"
Mar 17 12:13:01 crc kubenswrapper[4742]: I0317 12:13:01.579864 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3bba6aef-f8ff-436a-b3c1-97fbe9819ff1/openstack-network-exporter/0.log"
Mar 17 12:13:01 crc kubenswrapper[4742]: I0317 12:13:01.669520 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6976ff4586-bgqjp_221187ef-dec0-47dd-894e-ff9f2d1daa09/placement-api/0.log"
Mar 17 12:13:01 crc kubenswrapper[4742]: I0317 12:13:01.848041 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6c10e471-26c3-41ec-bf47-a5edf33c173d/setup-container/0.log"
Mar 17 12:13:01 crc kubenswrapper[4742]: I0317 12:13:01.848086 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6976ff4586-bgqjp_221187ef-dec0-47dd-894e-ff9f2d1daa09/placement-log/0.log"
Mar 17 12:13:02 crc kubenswrapper[4742]: I0317 12:13:02.118189 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6c10e471-26c3-41ec-bf47-a5edf33c173d/setup-container/0.log"
Mar 17 12:13:02 crc kubenswrapper[4742]: I0317 12:13:02.140358 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4e8c9887-8315-444e-b3dd-9753e83f83fa/setup-container/0.log"
Mar 17 12:13:02 crc kubenswrapper[4742]: I0317 12:13:02.197654 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6c10e471-26c3-41ec-bf47-a5edf33c173d/rabbitmq/0.log"
Mar 17 12:13:02 crc kubenswrapper[4742]: I0317 12:13:02.397939 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4e8c9887-8315-444e-b3dd-9753e83f83fa/setup-container/0.log"
Mar 17 12:13:02 crc kubenswrapper[4742]: I0317 12:13:02.424214 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4e8c9887-8315-444e-b3dd-9753e83f83fa/rabbitmq/0.log"
Mar 17 12:13:02 crc kubenswrapper[4742]: I0317 12:13:02.461142 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p_aa52e3ae-e09a-4561-990a-59358b9b17b6/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 17 12:13:02 crc kubenswrapper[4742]: I0317 12:13:02.610899 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-l8dw6_529b4c5a-8be2-4820-b06a-11eb75c3dc3b/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 17 12:13:02 crc kubenswrapper[4742]: I0317 12:13:02.672440 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4_abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 17 12:13:02 crc kubenswrapper[4742]: I0317 12:13:02.899652 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-7mg2n_2c1f61c9-540b-4044-ba34-2bb110401fa0/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 17 12:13:02 crc kubenswrapper[4742]: I0317 12:13:02.918125 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-p4fvs_ba90bc1a-0e57-455d-8594-4e11b1548097/ssh-known-hosts-edpm-deployment/0.log"
Mar 17 12:13:03 crc kubenswrapper[4742]: I0317 12:13:03.154143 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-c96b95bb7-ckpvc_b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe/proxy-server/0.log"
Mar 17 12:13:03 crc kubenswrapper[4742]: I0317 12:13:03.305361 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-rrnw9_3cc5195f-ecc0-4f8e-bc53-ea602fff501d/swift-ring-rebalance/0.log"
Mar 17 12:13:03 crc kubenswrapper[4742]: I0317 12:13:03.311978 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-c96b95bb7-ckpvc_b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe/proxy-httpd/0.log"
Mar 17 12:13:03 crc kubenswrapper[4742]: I0317 12:13:03.381550 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/account-auditor/0.log"
Mar 17 12:13:03 crc kubenswrapper[4742]: I0317 12:13:03.534255 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/account-reaper/0.log"
Mar 17 12:13:03 crc kubenswrapper[4742]: I0317 12:13:03.581021 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/account-server/0.log"
Mar 17 12:13:03 crc kubenswrapper[4742]: I0317 12:13:03.612674 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/account-replicator/0.log"
Mar 17 12:13:03 crc kubenswrapper[4742]: I0317 12:13:03.831715 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/container-server/0.log"
Mar 17 12:13:03 crc kubenswrapper[4742]: I0317 12:13:03.899691 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/container-auditor/0.log"
Mar 17 12:13:03 crc kubenswrapper[4742]: I0317 12:13:03.968708 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/container-replicator/0.log"
Mar 17 12:13:03 crc kubenswrapper[4742]: I0317 12:13:03.995078 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/container-updater/0.log"
Mar 17 12:13:04 crc kubenswrapper[4742]: I0317 12:13:04.139480 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/object-expirer/0.log"
Mar 17 12:13:04 crc kubenswrapper[4742]: I0317 12:13:04.169606 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/object-auditor/0.log"
Mar 17 12:13:04 crc kubenswrapper[4742]: I0317 12:13:04.192865 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/object-replicator/0.log"
Mar 17 12:13:04 crc kubenswrapper[4742]: I0317 12:13:04.233832 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/object-server/0.log"
Mar 17 12:13:04 crc kubenswrapper[4742]: I0317 12:13:04.327857 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/object-updater/0.log"
Mar 17 12:13:04 crc kubenswrapper[4742]: I0317 12:13:04.343774 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/rsync/0.log"
path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/rsync/0.log" Mar 17 12:13:04 crc kubenswrapper[4742]: I0317 12:13:04.366518 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/swift-recon-cron/0.log" Mar 17 12:13:04 crc kubenswrapper[4742]: I0317 12:13:04.606934 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw_24003f05-4f7d-443d-8a19-8162dae339a2/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:13:04 crc kubenswrapper[4742]: I0317 12:13:04.608807 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_cbe323de-3d55-4905-8f28-29cea959ae35/tempest-tests-tempest-tests-runner/0.log" Mar 17 12:13:04 crc kubenswrapper[4742]: I0317 12:13:04.796716 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6bfbc7cf-c913-4297-a60e-307a3829b636/test-operator-logs-container/0.log" Mar 17 12:13:04 crc kubenswrapper[4742]: I0317 12:13:04.834041 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-th85k_fe59da59-475f-4c7d-ab34-f3085125c224/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:13:07 crc kubenswrapper[4742]: I0317 12:13:07.662433 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:13:07 crc kubenswrapper[4742]: E0317 12:13:07.662968 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:13:14 crc kubenswrapper[4742]: I0317 12:13:14.167172 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_5cbf7636-aea9-4186-be9f-a4b25776158c/memcached/0.log" Mar 17 12:13:22 crc kubenswrapper[4742]: I0317 12:13:22.662543 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:13:22 crc kubenswrapper[4742]: E0317 12:13:22.663272 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:13:30 crc kubenswrapper[4742]: I0317 12:13:30.775147 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-g729j_45257cde-ca39-4e50-b465-b76ea15e179e/manager/0.log" Mar 17 12:13:31 crc kubenswrapper[4742]: I0317 12:13:31.060343 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q_e6b4bfa7-c424-4a08-8a06-f73809217eff/util/0.log" Mar 17 12:13:31 crc kubenswrapper[4742]: I0317 12:13:31.243293 4742 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q_e6b4bfa7-c424-4a08-8a06-f73809217eff/pull/0.log" Mar 17 12:13:31 crc kubenswrapper[4742]: I0317 12:13:31.300037 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q_e6b4bfa7-c424-4a08-8a06-f73809217eff/util/0.log" Mar 17 12:13:31 crc kubenswrapper[4742]: I0317 12:13:31.320650 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q_e6b4bfa7-c424-4a08-8a06-f73809217eff/pull/0.log" Mar 17 12:13:31 crc kubenswrapper[4742]: I0317 12:13:31.443472 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q_e6b4bfa7-c424-4a08-8a06-f73809217eff/util/0.log" Mar 17 12:13:31 crc kubenswrapper[4742]: I0317 12:13:31.522489 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q_e6b4bfa7-c424-4a08-8a06-f73809217eff/pull/0.log" Mar 17 12:13:31 crc kubenswrapper[4742]: I0317 12:13:31.576537 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q_e6b4bfa7-c424-4a08-8a06-f73809217eff/extract/0.log" Mar 17 12:13:31 crc kubenswrapper[4742]: I0317 12:13:31.659789 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-4phr7_27150936-220d-4247-b873-10add7124430/manager/0.log" Mar 17 12:13:31 crc kubenswrapper[4742]: I0317 12:13:31.736178 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-j5sfj_01ae7820-ca74-4237-ac4a-82b3605f2306/manager/0.log" Mar 17 12:13:31 crc kubenswrapper[4742]: I0317 12:13:31.904280 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-sq2xc_9b21605a-2c83-49df-ae0f-dfb172a1b9f5/manager/0.log" Mar 17 12:13:31 crc kubenswrapper[4742]: I0317 12:13:31.924463 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-6z2xv_a7d611a7-9728-4738-8efa-80883aa13b2b/manager/0.log" Mar 17 12:13:32 crc kubenswrapper[4742]: I0317 12:13:32.113697 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-znwjl_c8ccb584-e9e1-4eba-827e-3e7197f3133f/manager/0.log" Mar 17 12:13:32 crc kubenswrapper[4742]: I0317 12:13:32.363405 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-4mj6d_1cdb0787-4a2a-41f6-aed0-8693b2669444/manager/0.log" Mar 17 12:13:32 crc kubenswrapper[4742]: I0317 12:13:32.421085 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-njktv_c1f29dbe-e3d8-4dc0-aafe-fcd1de367544/manager/0.log" Mar 17 12:13:32 crc kubenswrapper[4742]: I0317 12:13:32.494187 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-dvbmd_b3928371-ca20-41d9-8200-36410c2df752/manager/0.log" Mar 17 12:13:32 crc kubenswrapper[4742]: I0317 12:13:32.576530 4742 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-xjp4g_f91fb07a-de67-44ff-b6af-446891941a60/manager/0.log" Mar 17 12:13:32 crc kubenswrapper[4742]: I0317 12:13:32.742178 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-7ttcf_46b5befe-2274-4bc8-a2c4-ce8a9fc915ae/manager/0.log" Mar 17 12:13:32 crc kubenswrapper[4742]: I0317 12:13:32.833685 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-g252s_0436441e-c132-4c65-aee5-8b20461c12e1/manager/0.log" Mar 17 12:13:33 crc kubenswrapper[4742]: I0317 12:13:33.015671 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-vshmg_88b49b71-3d6b-4ca0-8943-c0d0c10b9ff9/manager/0.log" Mar 17 12:13:33 crc kubenswrapper[4742]: I0317 12:13:33.028793 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-4fvjv_c59e15b4-2341-4b9e-8887-d6b1f594dc0e/manager/0.log" Mar 17 12:13:33 crc kubenswrapper[4742]: I0317 12:13:33.144549 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-89w9s_7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0/manager/0.log" Mar 17 12:13:33 crc kubenswrapper[4742]: I0317 12:13:33.296618 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-58b7c959b5-zkf6c_30159976-f1ef-435e-b6e6-995553b51f65/operator/0.log" Mar 17 12:13:33 crc kubenswrapper[4742]: I0317 12:13:33.483603 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-d2ktx_37e024f1-44f6-48c9-ba86-323127371c28/registry-server/0.log" Mar 17 12:13:33 crc kubenswrapper[4742]: I0317 12:13:33.672085 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-zvlv9_9470c17e-90c4-4723-b3ef-af8ec6f1edc2/manager/0.log" Mar 17 12:13:33 crc kubenswrapper[4742]: I0317 12:13:33.782036 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-c9j2m_5e3c7784-527e-4f97-b035-240b7014241f/manager/0.log" Mar 17 12:13:33 crc kubenswrapper[4742]: I0317 12:13:33.964621 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-44jhz_48d26de5-4809-4a61-82c3-03cbf56c57b0/operator/0.log" Mar 17 12:13:34 crc kubenswrapper[4742]: I0317 12:13:34.314866 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-fh8v8_53837a21-9249-4ff8-aa95-bdfbb6d49f33/manager/0.log" Mar 17 12:13:34 crc kubenswrapper[4742]: I0317 12:13:34.350336 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-rzpkl_f42b3e9f-55a9-47fe-a5b8-51b36d622657/manager/0.log" Mar 17 12:13:34 crc kubenswrapper[4742]: I0317 12:13:34.404587 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-c748c4754-6hffs_7d6829e2-3788-4653-91e4-bff007a7bb5d/manager/0.log" Mar 17 12:13:34 crc kubenswrapper[4742]: I0317 12:13:34.504989 4742 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-fhqr4_b6a6e1ca-6c30-4a35-bd0c-b700160fe8ee/manager/0.log" Mar 17 12:13:34 crc kubenswrapper[4742]: I0317 12:13:34.593337 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-rf6p4_0eaedaeb-8d0d-4fde-8b74-cdd689d56123/manager/0.log" Mar 17 12:13:34 crc kubenswrapper[4742]: I0317 12:13:34.663190 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:13:34 crc kubenswrapper[4742]: E0317 12:13:34.663385 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:13:49 crc kubenswrapper[4742]: I0317 12:13:49.663445 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:13:49 crc kubenswrapper[4742]: E0317 12:13:49.664709 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:13:53 crc kubenswrapper[4742]: I0317 12:13:53.666256 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-t2nj8_957049a3-8921-4ec9-a66c-d0fe15848fad/control-plane-machine-set-operator/0.log" Mar 17 12:13:53 crc kubenswrapper[4742]: I0317 12:13:53.796557 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bc2zs_76ed03a0-90ee-4e37-9580-d7136a7fdc5e/machine-api-operator/0.log" Mar 17 12:13:53 crc kubenswrapper[4742]: I0317 12:13:53.844964 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bc2zs_76ed03a0-90ee-4e37-9580-d7136a7fdc5e/kube-rbac-proxy/0.log" Mar 17 12:14:00 crc kubenswrapper[4742]: I0317 12:14:00.152955 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562494-684q5"] Mar 17 12:14:00 crc kubenswrapper[4742]: E0317 12:14:00.154007 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67b20216-9389-48ee-bcb0-50b7981401e4" containerName="container-00" Mar 17 12:14:00 crc kubenswrapper[4742]: I0317 12:14:00.154023 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="67b20216-9389-48ee-bcb0-50b7981401e4" containerName="container-00" Mar 17 12:14:00 crc kubenswrapper[4742]: I0317 12:14:00.154270 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="67b20216-9389-48ee-bcb0-50b7981401e4" containerName="container-00" Mar 17 12:14:00 crc kubenswrapper[4742]: I0317 12:14:00.155047 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562494-684q5" Mar 17 12:14:00 crc kubenswrapper[4742]: I0317 12:14:00.156653 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 12:14:00 crc kubenswrapper[4742]: I0317 12:14:00.158033 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 12:14:00 crc kubenswrapper[4742]: I0317 12:14:00.158236 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 12:14:00 crc kubenswrapper[4742]: I0317 12:14:00.184108 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562494-684q5"] Mar 17 12:14:00 crc kubenswrapper[4742]: I0317 12:14:00.303388 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9rlg\" (UniqueName: \"kubernetes.io/projected/ea9d544d-eba4-4212-bfc5-7aab8dc25615-kube-api-access-k9rlg\") pod \"auto-csr-approver-29562494-684q5\" (UID: \"ea9d544d-eba4-4212-bfc5-7aab8dc25615\") " pod="openshift-infra/auto-csr-approver-29562494-684q5" Mar 17 12:14:00 crc kubenswrapper[4742]: I0317 12:14:00.405304 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9rlg\" (UniqueName: \"kubernetes.io/projected/ea9d544d-eba4-4212-bfc5-7aab8dc25615-kube-api-access-k9rlg\") pod \"auto-csr-approver-29562494-684q5\" (UID: \"ea9d544d-eba4-4212-bfc5-7aab8dc25615\") " pod="openshift-infra/auto-csr-approver-29562494-684q5" Mar 17 12:14:00 crc kubenswrapper[4742]: I0317 12:14:00.434599 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9rlg\" (UniqueName: \"kubernetes.io/projected/ea9d544d-eba4-4212-bfc5-7aab8dc25615-kube-api-access-k9rlg\") pod \"auto-csr-approver-29562494-684q5\" (UID: \"ea9d544d-eba4-4212-bfc5-7aab8dc25615\") " pod="openshift-infra/auto-csr-approver-29562494-684q5" Mar 17 12:14:00 crc kubenswrapper[4742]: I0317 12:14:00.493176 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562494-684q5" Mar 17 12:14:00 crc kubenswrapper[4742]: I0317 12:14:00.662651 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:14:00 crc kubenswrapper[4742]: E0317 12:14:00.663252 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:14:00 crc kubenswrapper[4742]: I0317 12:14:00.957818 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562494-684q5"] Mar 17 12:14:00 crc kubenswrapper[4742]: W0317 12:14:00.965159 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea9d544d_eba4_4212_bfc5_7aab8dc25615.slice/crio-f82cd670e62ca0dd9b9cd6cd522df15c8bf5318cdb75ba28a697f8f19d692ba7 WatchSource:0}: Error finding container f82cd670e62ca0dd9b9cd6cd522df15c8bf5318cdb75ba28a697f8f19d692ba7: Status 404 returned error can't find the container with id f82cd670e62ca0dd9b9cd6cd522df15c8bf5318cdb75ba28a697f8f19d692ba7 Mar 17 12:14:01 crc kubenswrapper[4742]: I0317 12:14:01.376779 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562494-684q5" event={"ID":"ea9d544d-eba4-4212-bfc5-7aab8dc25615","Type":"ContainerStarted","Data":"f82cd670e62ca0dd9b9cd6cd522df15c8bf5318cdb75ba28a697f8f19d692ba7"} Mar 17 12:14:03 crc kubenswrapper[4742]: I0317 12:14:03.394299 4742 generic.go:334] "Generic (PLEG): container finished" podID="ea9d544d-eba4-4212-bfc5-7aab8dc25615" containerID="b6e2a03c5d15ddeff1558752e4c781d4491912fdaea734c00d450fbfb1474bec" exitCode=0 Mar 17 12:14:03 crc kubenswrapper[4742]: I0317 12:14:03.394357 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562494-684q5" event={"ID":"ea9d544d-eba4-4212-bfc5-7aab8dc25615","Type":"ContainerDied","Data":"b6e2a03c5d15ddeff1558752e4c781d4491912fdaea734c00d450fbfb1474bec"} Mar 17 12:14:04 crc kubenswrapper[4742]: I0317 12:14:04.773881 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562494-684q5" Mar 17 12:14:04 crc kubenswrapper[4742]: I0317 12:14:04.888536 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9rlg\" (UniqueName: \"kubernetes.io/projected/ea9d544d-eba4-4212-bfc5-7aab8dc25615-kube-api-access-k9rlg\") pod \"ea9d544d-eba4-4212-bfc5-7aab8dc25615\" (UID: \"ea9d544d-eba4-4212-bfc5-7aab8dc25615\") " Mar 17 12:14:04 crc kubenswrapper[4742]: I0317 12:14:04.901291 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea9d544d-eba4-4212-bfc5-7aab8dc25615-kube-api-access-k9rlg" (OuterVolumeSpecName: "kube-api-access-k9rlg") pod "ea9d544d-eba4-4212-bfc5-7aab8dc25615" (UID: "ea9d544d-eba4-4212-bfc5-7aab8dc25615"). InnerVolumeSpecName "kube-api-access-k9rlg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:14:04 crc kubenswrapper[4742]: I0317 12:14:04.991748 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9rlg\" (UniqueName: \"kubernetes.io/projected/ea9d544d-eba4-4212-bfc5-7aab8dc25615-kube-api-access-k9rlg\") on node \"crc\" DevicePath \"\"" Mar 17 12:14:05 crc kubenswrapper[4742]: I0317 12:14:05.416327 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562494-684q5" event={"ID":"ea9d544d-eba4-4212-bfc5-7aab8dc25615","Type":"ContainerDied","Data":"f82cd670e62ca0dd9b9cd6cd522df15c8bf5318cdb75ba28a697f8f19d692ba7"} Mar 17 12:14:05 crc kubenswrapper[4742]: I0317 12:14:05.416375 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f82cd670e62ca0dd9b9cd6cd522df15c8bf5318cdb75ba28a697f8f19d692ba7" Mar 17 12:14:05 crc kubenswrapper[4742]: I0317 12:14:05.416430 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562494-684q5" Mar 17 12:14:05 crc kubenswrapper[4742]: I0317 12:14:05.845135 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562488-d2k9r"] Mar 17 12:14:05 crc kubenswrapper[4742]: I0317 12:14:05.855833 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562488-d2k9r"] Mar 17 12:14:06 crc kubenswrapper[4742]: I0317 12:14:06.673561 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="942e9b43-8c54-4c69-8a59-f8315ce878b8" path="/var/lib/kubelet/pods/942e9b43-8c54-4c69-8a59-f8315ce878b8/volumes" Mar 17 12:14:07 crc kubenswrapper[4742]: I0317 12:14:07.032287 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-ncl69_fb8bea11-37f9-43cf-9a3c-07e54ebca5fa/cert-manager-controller/0.log" Mar 17 12:14:07 crc kubenswrapper[4742]: I0317 12:14:07.130312 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-k4cwb_a8125ed7-e435-4a7e-8b09-541af1b40820/cert-manager-cainjector/0.log" Mar 17 12:14:07 crc kubenswrapper[4742]: I0317 12:14:07.189938 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-vf65m_09203846-9e2d-4748-b11f-c64b5a9c9c85/cert-manager-webhook/0.log" Mar 17 12:14:12 crc kubenswrapper[4742]: I0317 12:14:12.663220 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:14:12 crc kubenswrapper[4742]: E0317 12:14:12.664000 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:14:20 crc kubenswrapper[4742]: I0317 12:14:20.182008 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-7wxg4_a9d77ceb-2194-4bf6-809d-30ebc45c4dba/nmstate-console-plugin/0.log" Mar 17 12:14:20 crc kubenswrapper[4742]: I0317 12:14:20.331014 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-b78m6_15a73401-5a6e-4a32-99ba-4efe8182c160/nmstate-handler/0.log" 
Mar 17 12:14:20 crc kubenswrapper[4742]: I0317 12:14:20.380645 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-ttrvv_36b76368-76e0-42c0-944f-c799a074ff7f/kube-rbac-proxy/0.log" Mar 17 12:14:20 crc kubenswrapper[4742]: I0317 12:14:20.466395 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-ttrvv_36b76368-76e0-42c0-944f-c799a074ff7f/nmstate-metrics/0.log" Mar 17 12:14:20 crc kubenswrapper[4742]: I0317 12:14:20.553154 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-7gfv5_8ec78658-d1d9-4fa9-953c-153e38522338/nmstate-operator/0.log" Mar 17 12:14:20 crc kubenswrapper[4742]: I0317 12:14:20.661103 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-cn8bb_b2286c3d-e7d9-4ab5-827b-e6f7b9453a5b/nmstate-webhook/0.log" Mar 17 12:14:23 crc kubenswrapper[4742]: I0317 12:14:23.664000 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:14:23 crc kubenswrapper[4742]: E0317 12:14:23.664898 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:14:36 crc kubenswrapper[4742]: I0317 12:14:36.663996 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:14:36 crc kubenswrapper[4742]: E0317 12:14:36.664789 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:14:49 crc kubenswrapper[4742]: I0317 12:14:49.662835 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:14:49 crc kubenswrapper[4742]: E0317 12:14:49.663708 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:14:50 crc kubenswrapper[4742]: I0317 12:14:50.364473 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-497xk_80e4c493-69b8-4854-b25a-5126fd02720e/controller/0.log" Mar 17 12:14:50 crc kubenswrapper[4742]: I0317 12:14:50.388661 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-497xk_80e4c493-69b8-4854-b25a-5126fd02720e/kube-rbac-proxy/0.log" Mar 17 12:14:50 crc kubenswrapper[4742]: I0317 12:14:50.549756 4742 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-frr-files/0.log" Mar 17 12:14:50 crc kubenswrapper[4742]: I0317 12:14:50.731454 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-frr-files/0.log" Mar 17 12:14:50 crc kubenswrapper[4742]: I0317 12:14:50.738647 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-reloader/0.log" Mar 17 12:14:50 crc kubenswrapper[4742]: I0317 12:14:50.771769 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-metrics/0.log" Mar 17 12:14:50 crc kubenswrapper[4742]: I0317 12:14:50.777929 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-reloader/0.log" Mar 17 12:14:50 crc kubenswrapper[4742]: I0317 12:14:50.900665 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-frr-files/0.log" Mar 17 12:14:50 crc kubenswrapper[4742]: I0317 12:14:50.961540 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-reloader/0.log" Mar 17 12:14:50 crc kubenswrapper[4742]: I0317 12:14:50.963370 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-metrics/0.log" Mar 17 12:14:50 crc kubenswrapper[4742]: I0317 12:14:50.987381 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-metrics/0.log" Mar 17 12:14:51 crc kubenswrapper[4742]: I0317 12:14:51.194673 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-metrics/0.log" Mar 17 12:14:51 crc kubenswrapper[4742]: I0317 12:14:51.198462 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/controller/0.log" Mar 17 12:14:51 crc kubenswrapper[4742]: I0317 12:14:51.208330 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-frr-files/0.log" Mar 17 12:14:51 crc kubenswrapper[4742]: I0317 12:14:51.215388 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-reloader/0.log" Mar 17 12:14:51 crc kubenswrapper[4742]: I0317 12:14:51.361664 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/frr-metrics/0.log" Mar 17 12:14:51 crc kubenswrapper[4742]: I0317 12:14:51.395522 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/kube-rbac-proxy/0.log" Mar 17 12:14:51 crc kubenswrapper[4742]: I0317 12:14:51.439424 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/kube-rbac-proxy-frr/0.log" Mar 17 12:14:51 crc kubenswrapper[4742]: I0317 12:14:51.589583 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/reloader/0.log" Mar 17 12:14:51 crc kubenswrapper[4742]: I0317 
12:14:51.713413 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-pfql6_e890c085-704d-45c9-9166-3d27780a18f6/frr-k8s-webhook-server/0.log" Mar 17 12:14:51 crc kubenswrapper[4742]: I0317 12:14:51.859549 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6cbc4688f7-5wdxf_f21bd592-6b38-41b3-a6a1-9b782891a659/manager/0.log" Mar 17 12:14:52 crc kubenswrapper[4742]: I0317 12:14:52.061244 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5df756f8d6-hq5d7_3e260a39-fc3d-48d3-90f5-151700332db7/webhook-server/0.log" Mar 17 12:14:52 crc kubenswrapper[4742]: I0317 12:14:52.150615 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-67kh2_f0349b48-f18d-415d-bb8c-2ee11d489f9e/kube-rbac-proxy/0.log" Mar 17 12:14:52 crc kubenswrapper[4742]: I0317 12:14:52.791228 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-67kh2_f0349b48-f18d-415d-bb8c-2ee11d489f9e/speaker/0.log" Mar 17 12:14:52 crc kubenswrapper[4742]: I0317 12:14:52.936006 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/frr/0.log" Mar 17 12:14:57 crc kubenswrapper[4742]: I0317 12:14:57.631097 4742 scope.go:117] "RemoveContainer" containerID="e43e575827c29d46c43fa664bb9238d3a111f9cd25c6323609e0a2f0d2bdb306" Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.145580 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv"] Mar 17 12:15:00 crc kubenswrapper[4742]: E0317 12:15:00.146476 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea9d544d-eba4-4212-bfc5-7aab8dc25615" containerName="oc" Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.146487 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea9d544d-eba4-4212-bfc5-7aab8dc25615" containerName="oc" Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.146659 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea9d544d-eba4-4212-bfc5-7aab8dc25615" containerName="oc" Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.147235 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv" Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.151413 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.151415 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.160161 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv"] Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.324363 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/335aa5bb-37d1-4c08-8a44-e39734a181ce-config-volume\") pod \"collect-profiles-29562495-kdqkv\" (UID: \"335aa5bb-37d1-4c08-8a44-e39734a181ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv" Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.324470 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hzqn\" (UniqueName: \"kubernetes.io/projected/335aa5bb-37d1-4c08-8a44-e39734a181ce-kube-api-access-5hzqn\") pod \"collect-profiles-29562495-kdqkv\" (UID: \"335aa5bb-37d1-4c08-8a44-e39734a181ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv" Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.324498 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/335aa5bb-37d1-4c08-8a44-e39734a181ce-secret-volume\") pod \"collect-profiles-29562495-kdqkv\" (UID: \"335aa5bb-37d1-4c08-8a44-e39734a181ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv" Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.425869 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/335aa5bb-37d1-4c08-8a44-e39734a181ce-config-volume\") pod \"collect-profiles-29562495-kdqkv\" (UID: \"335aa5bb-37d1-4c08-8a44-e39734a181ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv" Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.425967 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hzqn\" (UniqueName: \"kubernetes.io/projected/335aa5bb-37d1-4c08-8a44-e39734a181ce-kube-api-access-5hzqn\") pod \"collect-profiles-29562495-kdqkv\" (UID: \"335aa5bb-37d1-4c08-8a44-e39734a181ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv" Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.425998 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/335aa5bb-37d1-4c08-8a44-e39734a181ce-secret-volume\") pod \"collect-profiles-29562495-kdqkv\" (UID: \"335aa5bb-37d1-4c08-8a44-e39734a181ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv" Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.427942 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/335aa5bb-37d1-4c08-8a44-e39734a181ce-config-volume\") pod 
\"collect-profiles-29562495-kdqkv\" (UID: \"335aa5bb-37d1-4c08-8a44-e39734a181ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv" Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.434192 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/335aa5bb-37d1-4c08-8a44-e39734a181ce-secret-volume\") pod \"collect-profiles-29562495-kdqkv\" (UID: \"335aa5bb-37d1-4c08-8a44-e39734a181ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv" Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.453983 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hzqn\" (UniqueName: \"kubernetes.io/projected/335aa5bb-37d1-4c08-8a44-e39734a181ce-kube-api-access-5hzqn\") pod \"collect-profiles-29562495-kdqkv\" (UID: \"335aa5bb-37d1-4c08-8a44-e39734a181ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv" Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.463834 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv" Mar 17 12:15:00 crc kubenswrapper[4742]: I0317 12:15:00.895001 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv"] Mar 17 12:15:00 crc kubenswrapper[4742]: W0317 12:15:00.914087 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod335aa5bb_37d1_4c08_8a44_e39734a181ce.slice/crio-ea17b562c5db30a4109f83d1889a492c392e07e624d4a2230a1e31a7eb12134c WatchSource:0}: Error finding container ea17b562c5db30a4109f83d1889a492c392e07e624d4a2230a1e31a7eb12134c: Status 404 returned error can't find the container with id ea17b562c5db30a4109f83d1889a492c392e07e624d4a2230a1e31a7eb12134c Mar 17 12:15:01 crc kubenswrapper[4742]: I0317 12:15:01.916523 4742 generic.go:334] "Generic (PLEG): container finished" podID="335aa5bb-37d1-4c08-8a44-e39734a181ce" containerID="ab4f0c4739083256e535ad052379f938ebeaaa08beb014976a4c3857c6ab87e3" exitCode=0 Mar 17 12:15:01 crc kubenswrapper[4742]: I0317 12:15:01.916599 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv" event={"ID":"335aa5bb-37d1-4c08-8a44-e39734a181ce","Type":"ContainerDied","Data":"ab4f0c4739083256e535ad052379f938ebeaaa08beb014976a4c3857c6ab87e3"} Mar 17 12:15:01 crc kubenswrapper[4742]: I0317 12:15:01.917749 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv" event={"ID":"335aa5bb-37d1-4c08-8a44-e39734a181ce","Type":"ContainerStarted","Data":"ea17b562c5db30a4109f83d1889a492c392e07e624d4a2230a1e31a7eb12134c"} Mar 17 12:15:03 crc kubenswrapper[4742]: I0317 12:15:03.270473 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv" Mar 17 12:15:03 crc kubenswrapper[4742]: I0317 12:15:03.381087 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/335aa5bb-37d1-4c08-8a44-e39734a181ce-secret-volume\") pod \"335aa5bb-37d1-4c08-8a44-e39734a181ce\" (UID: \"335aa5bb-37d1-4c08-8a44-e39734a181ce\") " Mar 17 12:15:03 crc kubenswrapper[4742]: I0317 12:15:03.381533 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/335aa5bb-37d1-4c08-8a44-e39734a181ce-config-volume\") pod \"335aa5bb-37d1-4c08-8a44-e39734a181ce\" (UID: \"335aa5bb-37d1-4c08-8a44-e39734a181ce\") " Mar 17 12:15:03 crc kubenswrapper[4742]: I0317 12:15:03.381599 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hzqn\" (UniqueName: \"kubernetes.io/projected/335aa5bb-37d1-4c08-8a44-e39734a181ce-kube-api-access-5hzqn\") pod \"335aa5bb-37d1-4c08-8a44-e39734a181ce\" (UID: \"335aa5bb-37d1-4c08-8a44-e39734a181ce\") " Mar 17 12:15:03 crc kubenswrapper[4742]: I0317 12:15:03.384308 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/335aa5bb-37d1-4c08-8a44-e39734a181ce-config-volume" (OuterVolumeSpecName: "config-volume") pod "335aa5bb-37d1-4c08-8a44-e39734a181ce" (UID: "335aa5bb-37d1-4c08-8a44-e39734a181ce"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 12:15:03 crc kubenswrapper[4742]: I0317 12:15:03.386850 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/335aa5bb-37d1-4c08-8a44-e39734a181ce-kube-api-access-5hzqn" (OuterVolumeSpecName: "kube-api-access-5hzqn") pod "335aa5bb-37d1-4c08-8a44-e39734a181ce" (UID: "335aa5bb-37d1-4c08-8a44-e39734a181ce"). InnerVolumeSpecName "kube-api-access-5hzqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:15:03 crc kubenswrapper[4742]: I0317 12:15:03.399057 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/335aa5bb-37d1-4c08-8a44-e39734a181ce-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "335aa5bb-37d1-4c08-8a44-e39734a181ce" (UID: "335aa5bb-37d1-4c08-8a44-e39734a181ce"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 12:15:03 crc kubenswrapper[4742]: I0317 12:15:03.483366 4742 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/335aa5bb-37d1-4c08-8a44-e39734a181ce-config-volume\") on node \"crc\" DevicePath \"\"" Mar 17 12:15:03 crc kubenswrapper[4742]: I0317 12:15:03.483644 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hzqn\" (UniqueName: \"kubernetes.io/projected/335aa5bb-37d1-4c08-8a44-e39734a181ce-kube-api-access-5hzqn\") on node \"crc\" DevicePath \"\"" Mar 17 12:15:03 crc kubenswrapper[4742]: I0317 12:15:03.483708 4742 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/335aa5bb-37d1-4c08-8a44-e39734a181ce-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 17 12:15:03 crc kubenswrapper[4742]: I0317 12:15:03.663865 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:15:03 crc kubenswrapper[4742]: E0317 12:15:03.664297 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:15:03 crc kubenswrapper[4742]: I0317 12:15:03.941644 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv" event={"ID":"335aa5bb-37d1-4c08-8a44-e39734a181ce","Type":"ContainerDied","Data":"ea17b562c5db30a4109f83d1889a492c392e07e624d4a2230a1e31a7eb12134c"} Mar 17 12:15:03 crc kubenswrapper[4742]: I0317 12:15:03.941718 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea17b562c5db30a4109f83d1889a492c392e07e624d4a2230a1e31a7eb12134c" Mar 17 12:15:03 crc kubenswrapper[4742]: I0317 12:15:03.941740 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29562495-kdqkv" Mar 17 12:15:04 crc kubenswrapper[4742]: I0317 12:15:04.356180 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v"] Mar 17 12:15:04 crc kubenswrapper[4742]: I0317 12:15:04.370335 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29562450-5zd5v"] Mar 17 12:15:04 crc kubenswrapper[4742]: I0317 12:15:04.674297 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7057d36e-e38b-41f9-98f1-7f136f859aec" path="/var/lib/kubelet/pods/7057d36e-e38b-41f9-98f1-7f136f859aec/volumes" Mar 17 12:15:06 crc kubenswrapper[4742]: I0317 12:15:06.626054 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr_20e57e18-cc27-4d2e-9207-e784beb4ce2f/util/0.log" Mar 17 12:15:06 crc kubenswrapper[4742]: I0317 12:15:06.745817 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr_20e57e18-cc27-4d2e-9207-e784beb4ce2f/util/0.log" Mar 17 12:15:06 crc kubenswrapper[4742]: I0317 12:15:06.788782 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr_20e57e18-cc27-4d2e-9207-e784beb4ce2f/pull/0.log" Mar 17 12:15:06 crc kubenswrapper[4742]: I0317 12:15:06.807777 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr_20e57e18-cc27-4d2e-9207-e784beb4ce2f/pull/0.log" Mar 17 12:15:06 crc kubenswrapper[4742]: I0317 12:15:06.955444 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr_20e57e18-cc27-4d2e-9207-e784beb4ce2f/util/0.log" Mar 17 12:15:06 crc kubenswrapper[4742]: I0317 12:15:06.971226 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr_20e57e18-cc27-4d2e-9207-e784beb4ce2f/pull/0.log" Mar 17 12:15:07 crc kubenswrapper[4742]: I0317 12:15:07.012180 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr_20e57e18-cc27-4d2e-9207-e784beb4ce2f/extract/0.log" Mar 17 12:15:07 crc kubenswrapper[4742]: I0317 12:15:07.127174 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp_8011261b-573f-4e09-894b-0643fba90f8d/util/0.log" Mar 17 12:15:07 crc kubenswrapper[4742]: I0317 12:15:07.309200 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp_8011261b-573f-4e09-894b-0643fba90f8d/util/0.log" Mar 17 12:15:07 crc kubenswrapper[4742]: I0317 12:15:07.310163 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp_8011261b-573f-4e09-894b-0643fba90f8d/pull/0.log" Mar 17 12:15:07 crc kubenswrapper[4742]: I0317 12:15:07.338509 4742 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp_8011261b-573f-4e09-894b-0643fba90f8d/pull/0.log" Mar 17 12:15:07 crc kubenswrapper[4742]: I0317 12:15:07.488019 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp_8011261b-573f-4e09-894b-0643fba90f8d/util/0.log" Mar 17 12:15:07 crc kubenswrapper[4742]: I0317 12:15:07.550003 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp_8011261b-573f-4e09-894b-0643fba90f8d/pull/0.log" Mar 17 12:15:07 crc kubenswrapper[4742]: I0317 12:15:07.572468 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp_8011261b-573f-4e09-894b-0643fba90f8d/extract/0.log" Mar 17 12:15:07 crc kubenswrapper[4742]: I0317 12:15:07.645679 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nq4d_ebd9754c-6bff-490f-a8c5-5aa16bb9170e/extract-utilities/0.log" Mar 17 12:15:07 crc kubenswrapper[4742]: I0317 12:15:07.813473 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nq4d_ebd9754c-6bff-490f-a8c5-5aa16bb9170e/extract-content/0.log" Mar 17 12:15:07 crc kubenswrapper[4742]: I0317 12:15:07.840684 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nq4d_ebd9754c-6bff-490f-a8c5-5aa16bb9170e/extract-utilities/0.log" Mar 17 12:15:07 crc kubenswrapper[4742]: I0317 12:15:07.845390 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nq4d_ebd9754c-6bff-490f-a8c5-5aa16bb9170e/extract-content/0.log" Mar 17 12:15:08 crc kubenswrapper[4742]: I0317 12:15:08.000871 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nq4d_ebd9754c-6bff-490f-a8c5-5aa16bb9170e/extract-content/0.log" Mar 17 12:15:08 crc kubenswrapper[4742]: I0317 12:15:08.012492 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nq4d_ebd9754c-6bff-490f-a8c5-5aa16bb9170e/extract-utilities/0.log" Mar 17 12:15:08 crc kubenswrapper[4742]: I0317 12:15:08.269336 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f2tmr_3aae4d83-a6a4-440f-b772-a5cb34a9f1fa/extract-utilities/0.log" Mar 17 12:15:08 crc kubenswrapper[4742]: I0317 12:15:08.405683 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f2tmr_3aae4d83-a6a4-440f-b772-a5cb34a9f1fa/extract-utilities/0.log" Mar 17 12:15:08 crc kubenswrapper[4742]: I0317 12:15:08.449721 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f2tmr_3aae4d83-a6a4-440f-b772-a5cb34a9f1fa/extract-content/0.log" Mar 17 12:15:08 crc kubenswrapper[4742]: I0317 12:15:08.522181 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f2tmr_3aae4d83-a6a4-440f-b772-a5cb34a9f1fa/extract-content/0.log" Mar 17 12:15:08 crc kubenswrapper[4742]: I0317 12:15:08.707116 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nq4d_ebd9754c-6bff-490f-a8c5-5aa16bb9170e/registry-server/0.log" Mar 17 12:15:08 
crc kubenswrapper[4742]: I0317 12:15:08.712471 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f2tmr_3aae4d83-a6a4-440f-b772-a5cb34a9f1fa/extract-utilities/0.log" Mar 17 12:15:08 crc kubenswrapper[4742]: I0317 12:15:08.745455 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f2tmr_3aae4d83-a6a4-440f-b772-a5cb34a9f1fa/extract-content/0.log" Mar 17 12:15:08 crc kubenswrapper[4742]: I0317 12:15:08.908029 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-rxctp_66e4c4dd-b0fe-4877-8520-bdbd18b096d4/marketplace-operator/0.log" Mar 17 12:15:09 crc kubenswrapper[4742]: I0317 12:15:09.120466 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f2tmr_3aae4d83-a6a4-440f-b772-a5cb34a9f1fa/registry-server/0.log" Mar 17 12:15:09 crc kubenswrapper[4742]: I0317 12:15:09.129056 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhqsm_e827c1af-bb51-4f3d-bf81-708986989404/extract-utilities/0.log" Mar 17 12:15:09 crc kubenswrapper[4742]: I0317 12:15:09.263685 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhqsm_e827c1af-bb51-4f3d-bf81-708986989404/extract-utilities/0.log" Mar 17 12:15:09 crc kubenswrapper[4742]: I0317 12:15:09.265065 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhqsm_e827c1af-bb51-4f3d-bf81-708986989404/extract-content/0.log" Mar 17 12:15:09 crc kubenswrapper[4742]: I0317 12:15:09.268854 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhqsm_e827c1af-bb51-4f3d-bf81-708986989404/extract-content/0.log" Mar 17 12:15:09 crc kubenswrapper[4742]: I0317 12:15:09.424119 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhqsm_e827c1af-bb51-4f3d-bf81-708986989404/extract-content/0.log" Mar 17 12:15:09 crc kubenswrapper[4742]: I0317 12:15:09.454859 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhqsm_e827c1af-bb51-4f3d-bf81-708986989404/extract-utilities/0.log" Mar 17 12:15:09 crc kubenswrapper[4742]: I0317 12:15:09.547798 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhqsm_e827c1af-bb51-4f3d-bf81-708986989404/registry-server/0.log" Mar 17 12:15:09 crc kubenswrapper[4742]: I0317 12:15:09.625966 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p52h7_a52e996e-9305-4a8a-bb51-9d2d72223dcf/extract-utilities/0.log" Mar 17 12:15:09 crc kubenswrapper[4742]: I0317 12:15:09.753672 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p52h7_a52e996e-9305-4a8a-bb51-9d2d72223dcf/extract-content/0.log" Mar 17 12:15:09 crc kubenswrapper[4742]: I0317 12:15:09.762245 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p52h7_a52e996e-9305-4a8a-bb51-9d2d72223dcf/extract-utilities/0.log" Mar 17 12:15:09 crc kubenswrapper[4742]: I0317 12:15:09.820289 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p52h7_a52e996e-9305-4a8a-bb51-9d2d72223dcf/extract-content/0.log" Mar 17 12:15:09 crc 
kubenswrapper[4742]: I0317 12:15:09.947270 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p52h7_a52e996e-9305-4a8a-bb51-9d2d72223dcf/extract-utilities/0.log" Mar 17 12:15:09 crc kubenswrapper[4742]: I0317 12:15:09.977854 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p52h7_a52e996e-9305-4a8a-bb51-9d2d72223dcf/extract-content/0.log" Mar 17 12:15:10 crc kubenswrapper[4742]: I0317 12:15:10.602413 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-p52h7_a52e996e-9305-4a8a-bb51-9d2d72223dcf/registry-server/0.log" Mar 17 12:15:17 crc kubenswrapper[4742]: I0317 12:15:17.663706 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:15:17 crc kubenswrapper[4742]: E0317 12:15:17.664897 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:15:27 crc kubenswrapper[4742]: E0317 12:15:27.267163 4742 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.230:45738->38.102.83.230:42569: write tcp 38.102.83.230:45738->38.102.83.230:42569: write: connection reset by peer Mar 17 12:15:29 crc kubenswrapper[4742]: I0317 12:15:29.663343 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:15:29 crc kubenswrapper[4742]: E0317 12:15:29.664491 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:15:31 crc kubenswrapper[4742]: I0317 12:15:31.428168 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lscnq"] Mar 17 12:15:31 crc kubenswrapper[4742]: E0317 12:15:31.428732 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="335aa5bb-37d1-4c08-8a44-e39734a181ce" containerName="collect-profiles" Mar 17 12:15:31 crc kubenswrapper[4742]: I0317 12:15:31.428744 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="335aa5bb-37d1-4c08-8a44-e39734a181ce" containerName="collect-profiles" Mar 17 12:15:31 crc kubenswrapper[4742]: I0317 12:15:31.428896 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="335aa5bb-37d1-4c08-8a44-e39734a181ce" containerName="collect-profiles" Mar 17 12:15:31 crc kubenswrapper[4742]: I0317 12:15:31.430088 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lscnq" Mar 17 12:15:31 crc kubenswrapper[4742]: I0317 12:15:31.446359 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lscnq"] Mar 17 12:15:31 crc kubenswrapper[4742]: I0317 12:15:31.627768 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d2cca09-4d0a-4038-9136-fd982105b028-utilities\") pod \"community-operators-lscnq\" (UID: \"2d2cca09-4d0a-4038-9136-fd982105b028\") " pod="openshift-marketplace/community-operators-lscnq" Mar 17 12:15:31 crc kubenswrapper[4742]: I0317 12:15:31.627836 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5gx6\" (UniqueName: \"kubernetes.io/projected/2d2cca09-4d0a-4038-9136-fd982105b028-kube-api-access-j5gx6\") pod \"community-operators-lscnq\" (UID: \"2d2cca09-4d0a-4038-9136-fd982105b028\") " pod="openshift-marketplace/community-operators-lscnq" Mar 17 12:15:31 crc kubenswrapper[4742]: I0317 12:15:31.627990 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d2cca09-4d0a-4038-9136-fd982105b028-catalog-content\") pod \"community-operators-lscnq\" (UID: \"2d2cca09-4d0a-4038-9136-fd982105b028\") " pod="openshift-marketplace/community-operators-lscnq" Mar 17 12:15:31 crc kubenswrapper[4742]: I0317 12:15:31.730076 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d2cca09-4d0a-4038-9136-fd982105b028-catalog-content\") pod \"community-operators-lscnq\" (UID: \"2d2cca09-4d0a-4038-9136-fd982105b028\") " pod="openshift-marketplace/community-operators-lscnq" Mar 17 12:15:31 crc kubenswrapper[4742]: I0317 12:15:31.730224 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d2cca09-4d0a-4038-9136-fd982105b028-utilities\") pod \"community-operators-lscnq\" (UID: \"2d2cca09-4d0a-4038-9136-fd982105b028\") " pod="openshift-marketplace/community-operators-lscnq" Mar 17 12:15:31 crc kubenswrapper[4742]: I0317 12:15:31.730268 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5gx6\" (UniqueName: \"kubernetes.io/projected/2d2cca09-4d0a-4038-9136-fd982105b028-kube-api-access-j5gx6\") pod \"community-operators-lscnq\" (UID: \"2d2cca09-4d0a-4038-9136-fd982105b028\") " pod="openshift-marketplace/community-operators-lscnq" Mar 17 12:15:31 crc kubenswrapper[4742]: I0317 12:15:31.730848 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d2cca09-4d0a-4038-9136-fd982105b028-catalog-content\") pod \"community-operators-lscnq\" (UID: \"2d2cca09-4d0a-4038-9136-fd982105b028\") " pod="openshift-marketplace/community-operators-lscnq" Mar 17 12:15:31 crc kubenswrapper[4742]: I0317 12:15:31.730926 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d2cca09-4d0a-4038-9136-fd982105b028-utilities\") pod \"community-operators-lscnq\" (UID: \"2d2cca09-4d0a-4038-9136-fd982105b028\") " pod="openshift-marketplace/community-operators-lscnq" Mar 17 12:15:31 crc kubenswrapper[4742]: I0317 12:15:31.753124 4742 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-j5gx6\" (UniqueName: \"kubernetes.io/projected/2d2cca09-4d0a-4038-9136-fd982105b028-kube-api-access-j5gx6\") pod \"community-operators-lscnq\" (UID: \"2d2cca09-4d0a-4038-9136-fd982105b028\") " pod="openshift-marketplace/community-operators-lscnq" Mar 17 12:15:31 crc kubenswrapper[4742]: I0317 12:15:31.760232 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lscnq" Mar 17 12:15:32 crc kubenswrapper[4742]: I0317 12:15:32.314115 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lscnq"] Mar 17 12:15:33 crc kubenswrapper[4742]: I0317 12:15:33.226126 4742 generic.go:334] "Generic (PLEG): container finished" podID="2d2cca09-4d0a-4038-9136-fd982105b028" containerID="301cdb20d90dd7313a5b818fee3bc84c1ab5aa4f7a5817ebddb0022b851e5a4d" exitCode=0 Mar 17 12:15:33 crc kubenswrapper[4742]: I0317 12:15:33.226455 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lscnq" event={"ID":"2d2cca09-4d0a-4038-9136-fd982105b028","Type":"ContainerDied","Data":"301cdb20d90dd7313a5b818fee3bc84c1ab5aa4f7a5817ebddb0022b851e5a4d"} Mar 17 12:15:33 crc kubenswrapper[4742]: I0317 12:15:33.226487 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lscnq" event={"ID":"2d2cca09-4d0a-4038-9136-fd982105b028","Type":"ContainerStarted","Data":"e1361e7335158eddfdcb91369c5617110ea4567e8e5fd118a707ded70ca95b74"} Mar 17 12:15:33 crc kubenswrapper[4742]: I0317 12:15:33.229309 4742 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 12:15:34 crc kubenswrapper[4742]: I0317 12:15:34.235766 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lscnq" event={"ID":"2d2cca09-4d0a-4038-9136-fd982105b028","Type":"ContainerStarted","Data":"f0c52d62d5aa8ab7966eba80c9e88f9a4be3e03bce48a2b9a8e0dfea0a43702b"} Mar 17 12:15:36 crc kubenswrapper[4742]: I0317 12:15:36.252801 4742 generic.go:334] "Generic (PLEG): container finished" podID="2d2cca09-4d0a-4038-9136-fd982105b028" containerID="f0c52d62d5aa8ab7966eba80c9e88f9a4be3e03bce48a2b9a8e0dfea0a43702b" exitCode=0 Mar 17 12:15:36 crc kubenswrapper[4742]: I0317 12:15:36.253078 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lscnq" event={"ID":"2d2cca09-4d0a-4038-9136-fd982105b028","Type":"ContainerDied","Data":"f0c52d62d5aa8ab7966eba80c9e88f9a4be3e03bce48a2b9a8e0dfea0a43702b"} Mar 17 12:15:37 crc kubenswrapper[4742]: I0317 12:15:37.277864 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lscnq" event={"ID":"2d2cca09-4d0a-4038-9136-fd982105b028","Type":"ContainerStarted","Data":"0ee07661525a1ac48c43d9f0e95911130647d9f6406d3daa5671c4072efc1748"} Mar 17 12:15:37 crc kubenswrapper[4742]: I0317 12:15:37.300419 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lscnq" podStartSLOduration=2.612147891 podStartE2EDuration="6.300400521s" podCreationTimestamp="2026-03-17 12:15:31 +0000 UTC" firstStartedPulling="2026-03-17 12:15:33.228781136 +0000 UTC m=+3836.354908924" lastFinishedPulling="2026-03-17 12:15:36.917033796 +0000 UTC m=+3840.043161554" observedRunningTime="2026-03-17 12:15:37.296397922 +0000 UTC m=+3840.422525700" watchObservedRunningTime="2026-03-17 
12:15:37.300400521 +0000 UTC m=+3840.426528279" Mar 17 12:15:38 crc kubenswrapper[4742]: I0317 12:15:38.168140 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8s4km"] Mar 17 12:15:38 crc kubenswrapper[4742]: I0317 12:15:38.170072 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8s4km" Mar 17 12:15:38 crc kubenswrapper[4742]: I0317 12:15:38.181658 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8s4km"] Mar 17 12:15:38 crc kubenswrapper[4742]: I0317 12:15:38.355398 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-catalog-content\") pod \"redhat-marketplace-8s4km\" (UID: \"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5\") " pod="openshift-marketplace/redhat-marketplace-8s4km" Mar 17 12:15:38 crc kubenswrapper[4742]: I0317 12:15:38.355551 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh98h\" (UniqueName: \"kubernetes.io/projected/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-kube-api-access-fh98h\") pod \"redhat-marketplace-8s4km\" (UID: \"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5\") " pod="openshift-marketplace/redhat-marketplace-8s4km" Mar 17 12:15:38 crc kubenswrapper[4742]: I0317 12:15:38.355580 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-utilities\") pod \"redhat-marketplace-8s4km\" (UID: \"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5\") " pod="openshift-marketplace/redhat-marketplace-8s4km" Mar 17 12:15:38 crc kubenswrapper[4742]: I0317 12:15:38.457719 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh98h\" (UniqueName: \"kubernetes.io/projected/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-kube-api-access-fh98h\") pod \"redhat-marketplace-8s4km\" (UID: \"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5\") " pod="openshift-marketplace/redhat-marketplace-8s4km" Mar 17 12:15:38 crc kubenswrapper[4742]: I0317 12:15:38.457778 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-utilities\") pod \"redhat-marketplace-8s4km\" (UID: \"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5\") " pod="openshift-marketplace/redhat-marketplace-8s4km" Mar 17 12:15:38 crc kubenswrapper[4742]: I0317 12:15:38.457961 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-catalog-content\") pod \"redhat-marketplace-8s4km\" (UID: \"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5\") " pod="openshift-marketplace/redhat-marketplace-8s4km" Mar 17 12:15:38 crc kubenswrapper[4742]: I0317 12:15:38.458408 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-utilities\") pod \"redhat-marketplace-8s4km\" (UID: \"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5\") " pod="openshift-marketplace/redhat-marketplace-8s4km" Mar 17 12:15:38 crc kubenswrapper[4742]: I0317 12:15:38.458460 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-catalog-content\") pod \"redhat-marketplace-8s4km\" (UID: \"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5\") " pod="openshift-marketplace/redhat-marketplace-8s4km" Mar 17 12:15:38 crc kubenswrapper[4742]: I0317 12:15:38.487890 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh98h\" (UniqueName: \"kubernetes.io/projected/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-kube-api-access-fh98h\") pod \"redhat-marketplace-8s4km\" (UID: \"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5\") " pod="openshift-marketplace/redhat-marketplace-8s4km" Mar 17 12:15:38 crc kubenswrapper[4742]: I0317 12:15:38.489155 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8s4km" Mar 17 12:15:39 crc kubenswrapper[4742]: I0317 12:15:39.037886 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8s4km"] Mar 17 12:15:39 crc kubenswrapper[4742]: I0317 12:15:39.295806 4742 generic.go:334] "Generic (PLEG): container finished" podID="b600c3cc-d6a2-4ea6-a60f-76aaa957dca5" containerID="a75e7909a2af2585bf130965c3120169458f645bbde0b6ddea8d8f04be389a25" exitCode=0 Mar 17 12:15:39 crc kubenswrapper[4742]: I0317 12:15:39.295851 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8s4km" event={"ID":"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5","Type":"ContainerDied","Data":"a75e7909a2af2585bf130965c3120169458f645bbde0b6ddea8d8f04be389a25"} Mar 17 12:15:39 crc kubenswrapper[4742]: I0317 12:15:39.295885 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8s4km" event={"ID":"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5","Type":"ContainerStarted","Data":"5526b578e7f4b986ca9b646dbec5a8bc2af496e9cf90995cb7331ca99ba2b3d6"} Mar 17 12:15:40 crc kubenswrapper[4742]: I0317 12:15:40.305745 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8s4km" event={"ID":"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5","Type":"ContainerStarted","Data":"67d1906deb1463c0a5b6a33e636f1b5f2c08d7754bcfb8916f238cd889b7126f"} Mar 17 12:15:41 crc kubenswrapper[4742]: I0317 12:15:41.315465 4742 generic.go:334] "Generic (PLEG): container finished" podID="b600c3cc-d6a2-4ea6-a60f-76aaa957dca5" containerID="67d1906deb1463c0a5b6a33e636f1b5f2c08d7754bcfb8916f238cd889b7126f" exitCode=0 Mar 17 12:15:41 crc kubenswrapper[4742]: I0317 12:15:41.315512 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8s4km" event={"ID":"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5","Type":"ContainerDied","Data":"67d1906deb1463c0a5b6a33e636f1b5f2c08d7754bcfb8916f238cd889b7126f"} Mar 17 12:15:41 crc kubenswrapper[4742]: I0317 12:15:41.760292 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lscnq" Mar 17 12:15:41 crc kubenswrapper[4742]: I0317 12:15:41.761049 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lscnq" Mar 17 12:15:42 crc kubenswrapper[4742]: I0317 12:15:42.333455 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8s4km" event={"ID":"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5","Type":"ContainerStarted","Data":"864635f26ad810f471a560a21c54f12eff3f726ced78db0f5dc9bada7bbdfebc"} Mar 17 12:15:42 crc kubenswrapper[4742]: I0317 12:15:42.359430 4742 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8s4km" podStartSLOduration=1.9106275830000001 podStartE2EDuration="4.359411574s" podCreationTimestamp="2026-03-17 12:15:38 +0000 UTC" firstStartedPulling="2026-03-17 12:15:39.297733409 +0000 UTC m=+3842.423861167" lastFinishedPulling="2026-03-17 12:15:41.7465174 +0000 UTC m=+3844.872645158" observedRunningTime="2026-03-17 12:15:42.347821319 +0000 UTC m=+3845.473949107" watchObservedRunningTime="2026-03-17 12:15:42.359411574 +0000 UTC m=+3845.485539332" Mar 17 12:15:42 crc kubenswrapper[4742]: I0317 12:15:42.826730 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-lscnq" podUID="2d2cca09-4d0a-4038-9136-fd982105b028" containerName="registry-server" probeResult="failure" output=< Mar 17 12:15:42 crc kubenswrapper[4742]: timeout: failed to connect service ":50051" within 1s Mar 17 12:15:42 crc kubenswrapper[4742]: > Mar 17 12:15:43 crc kubenswrapper[4742]: I0317 12:15:43.663512 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:15:43 crc kubenswrapper[4742]: E0317 12:15:43.664052 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:15:48 crc kubenswrapper[4742]: I0317 12:15:48.490788 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8s4km" Mar 17 12:15:48 crc kubenswrapper[4742]: I0317 12:15:48.491270 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8s4km" Mar 17 12:15:48 crc kubenswrapper[4742]: I0317 12:15:48.602403 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8s4km" Mar 17 12:15:49 crc kubenswrapper[4742]: I0317 12:15:49.502264 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8s4km" Mar 17 12:15:49 crc kubenswrapper[4742]: I0317 12:15:49.561867 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8s4km"] Mar 17 12:15:51 crc kubenswrapper[4742]: I0317 12:15:51.440380 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8s4km" podUID="b600c3cc-d6a2-4ea6-a60f-76aaa957dca5" containerName="registry-server" containerID="cri-o://864635f26ad810f471a560a21c54f12eff3f726ced78db0f5dc9bada7bbdfebc" gracePeriod=2 Mar 17 12:15:51 crc kubenswrapper[4742]: I0317 12:15:51.830349 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lscnq" Mar 17 12:15:51 crc kubenswrapper[4742]: I0317 12:15:51.890567 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lscnq" Mar 17 12:15:51 crc kubenswrapper[4742]: I0317 12:15:51.908963 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8s4km" Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.015325 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-catalog-content\") pod \"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5\" (UID: \"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5\") " Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.015394 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh98h\" (UniqueName: \"kubernetes.io/projected/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-kube-api-access-fh98h\") pod \"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5\" (UID: \"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5\") " Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.015462 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-utilities\") pod \"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5\" (UID: \"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5\") " Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.016396 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-utilities" (OuterVolumeSpecName: "utilities") pod "b600c3cc-d6a2-4ea6-a60f-76aaa957dca5" (UID: "b600c3cc-d6a2-4ea6-a60f-76aaa957dca5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.029257 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-kube-api-access-fh98h" (OuterVolumeSpecName: "kube-api-access-fh98h") pod "b600c3cc-d6a2-4ea6-a60f-76aaa957dca5" (UID: "b600c3cc-d6a2-4ea6-a60f-76aaa957dca5"). InnerVolumeSpecName "kube-api-access-fh98h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.050783 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b600c3cc-d6a2-4ea6-a60f-76aaa957dca5" (UID: "b600c3cc-d6a2-4ea6-a60f-76aaa957dca5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.118502 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.118796 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh98h\" (UniqueName: \"kubernetes.io/projected/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-kube-api-access-fh98h\") on node \"crc\" DevicePath \"\"" Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.118881 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.447188 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lscnq"] Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.456931 4742 generic.go:334] "Generic (PLEG): container finished" podID="b600c3cc-d6a2-4ea6-a60f-76aaa957dca5" containerID="864635f26ad810f471a560a21c54f12eff3f726ced78db0f5dc9bada7bbdfebc" exitCode=0 Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.457416 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8s4km" Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.457716 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8s4km" event={"ID":"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5","Type":"ContainerDied","Data":"864635f26ad810f471a560a21c54f12eff3f726ced78db0f5dc9bada7bbdfebc"} Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.457742 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8s4km" event={"ID":"b600c3cc-d6a2-4ea6-a60f-76aaa957dca5","Type":"ContainerDied","Data":"5526b578e7f4b986ca9b646dbec5a8bc2af496e9cf90995cb7331ca99ba2b3d6"} Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.457757 4742 scope.go:117] "RemoveContainer" containerID="864635f26ad810f471a560a21c54f12eff3f726ced78db0f5dc9bada7bbdfebc" Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.501467 4742 scope.go:117] "RemoveContainer" containerID="67d1906deb1463c0a5b6a33e636f1b5f2c08d7754bcfb8916f238cd889b7126f" Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.510499 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8s4km"] Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.525052 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8s4km"] Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.532480 4742 scope.go:117] "RemoveContainer" containerID="a75e7909a2af2585bf130965c3120169458f645bbde0b6ddea8d8f04be389a25" Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.585627 4742 scope.go:117] "RemoveContainer" containerID="864635f26ad810f471a560a21c54f12eff3f726ced78db0f5dc9bada7bbdfebc" Mar 17 12:15:52 crc kubenswrapper[4742]: E0317 12:15:52.586000 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"864635f26ad810f471a560a21c54f12eff3f726ced78db0f5dc9bada7bbdfebc\": container with ID starting with 864635f26ad810f471a560a21c54f12eff3f726ced78db0f5dc9bada7bbdfebc not 
found: ID does not exist" containerID="864635f26ad810f471a560a21c54f12eff3f726ced78db0f5dc9bada7bbdfebc" Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.586030 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"864635f26ad810f471a560a21c54f12eff3f726ced78db0f5dc9bada7bbdfebc"} err="failed to get container status \"864635f26ad810f471a560a21c54f12eff3f726ced78db0f5dc9bada7bbdfebc\": rpc error: code = NotFound desc = could not find container \"864635f26ad810f471a560a21c54f12eff3f726ced78db0f5dc9bada7bbdfebc\": container with ID starting with 864635f26ad810f471a560a21c54f12eff3f726ced78db0f5dc9bada7bbdfebc not found: ID does not exist" Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.586057 4742 scope.go:117] "RemoveContainer" containerID="67d1906deb1463c0a5b6a33e636f1b5f2c08d7754bcfb8916f238cd889b7126f" Mar 17 12:15:52 crc kubenswrapper[4742]: E0317 12:15:52.586539 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67d1906deb1463c0a5b6a33e636f1b5f2c08d7754bcfb8916f238cd889b7126f\": container with ID starting with 67d1906deb1463c0a5b6a33e636f1b5f2c08d7754bcfb8916f238cd889b7126f not found: ID does not exist" containerID="67d1906deb1463c0a5b6a33e636f1b5f2c08d7754bcfb8916f238cd889b7126f" Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.586596 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67d1906deb1463c0a5b6a33e636f1b5f2c08d7754bcfb8916f238cd889b7126f"} err="failed to get container status \"67d1906deb1463c0a5b6a33e636f1b5f2c08d7754bcfb8916f238cd889b7126f\": rpc error: code = NotFound desc = could not find container \"67d1906deb1463c0a5b6a33e636f1b5f2c08d7754bcfb8916f238cd889b7126f\": container with ID starting with 67d1906deb1463c0a5b6a33e636f1b5f2c08d7754bcfb8916f238cd889b7126f not found: ID does not exist" Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.586625 4742 scope.go:117] "RemoveContainer" containerID="a75e7909a2af2585bf130965c3120169458f645bbde0b6ddea8d8f04be389a25" Mar 17 12:15:52 crc kubenswrapper[4742]: E0317 12:15:52.586882 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a75e7909a2af2585bf130965c3120169458f645bbde0b6ddea8d8f04be389a25\": container with ID starting with a75e7909a2af2585bf130965c3120169458f645bbde0b6ddea8d8f04be389a25 not found: ID does not exist" containerID="a75e7909a2af2585bf130965c3120169458f645bbde0b6ddea8d8f04be389a25" Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.586919 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a75e7909a2af2585bf130965c3120169458f645bbde0b6ddea8d8f04be389a25"} err="failed to get container status \"a75e7909a2af2585bf130965c3120169458f645bbde0b6ddea8d8f04be389a25\": rpc error: code = NotFound desc = could not find container \"a75e7909a2af2585bf130965c3120169458f645bbde0b6ddea8d8f04be389a25\": container with ID starting with a75e7909a2af2585bf130965c3120169458f645bbde0b6ddea8d8f04be389a25 not found: ID does not exist" Mar 17 12:15:52 crc kubenswrapper[4742]: I0317 12:15:52.678079 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b600c3cc-d6a2-4ea6-a60f-76aaa957dca5" path="/var/lib/kubelet/pods/b600c3cc-d6a2-4ea6-a60f-76aaa957dca5/volumes" Mar 17 12:15:53 crc kubenswrapper[4742]: I0317 12:15:53.474696 4742 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/community-operators-lscnq" podUID="2d2cca09-4d0a-4038-9136-fd982105b028" containerName="registry-server" containerID="cri-o://0ee07661525a1ac48c43d9f0e95911130647d9f6406d3daa5671c4072efc1748" gracePeriod=2 Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.091977 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lscnq" Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.259937 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d2cca09-4d0a-4038-9136-fd982105b028-utilities\") pod \"2d2cca09-4d0a-4038-9136-fd982105b028\" (UID: \"2d2cca09-4d0a-4038-9136-fd982105b028\") " Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.260253 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5gx6\" (UniqueName: \"kubernetes.io/projected/2d2cca09-4d0a-4038-9136-fd982105b028-kube-api-access-j5gx6\") pod \"2d2cca09-4d0a-4038-9136-fd982105b028\" (UID: \"2d2cca09-4d0a-4038-9136-fd982105b028\") " Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.260312 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d2cca09-4d0a-4038-9136-fd982105b028-catalog-content\") pod \"2d2cca09-4d0a-4038-9136-fd982105b028\" (UID: \"2d2cca09-4d0a-4038-9136-fd982105b028\") " Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.260863 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d2cca09-4d0a-4038-9136-fd982105b028-utilities" (OuterVolumeSpecName: "utilities") pod "2d2cca09-4d0a-4038-9136-fd982105b028" (UID: "2d2cca09-4d0a-4038-9136-fd982105b028"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.266155 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d2cca09-4d0a-4038-9136-fd982105b028-kube-api-access-j5gx6" (OuterVolumeSpecName: "kube-api-access-j5gx6") pod "2d2cca09-4d0a-4038-9136-fd982105b028" (UID: "2d2cca09-4d0a-4038-9136-fd982105b028"). InnerVolumeSpecName "kube-api-access-j5gx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.268448 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5gx6\" (UniqueName: \"kubernetes.io/projected/2d2cca09-4d0a-4038-9136-fd982105b028-kube-api-access-j5gx6\") on node \"crc\" DevicePath \"\"" Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.268466 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d2cca09-4d0a-4038-9136-fd982105b028-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.347840 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d2cca09-4d0a-4038-9136-fd982105b028-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d2cca09-4d0a-4038-9136-fd982105b028" (UID: "2d2cca09-4d0a-4038-9136-fd982105b028"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.369966 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d2cca09-4d0a-4038-9136-fd982105b028-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.486025 4742 generic.go:334] "Generic (PLEG): container finished" podID="2d2cca09-4d0a-4038-9136-fd982105b028" containerID="0ee07661525a1ac48c43d9f0e95911130647d9f6406d3daa5671c4072efc1748" exitCode=0 Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.486198 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lscnq" event={"ID":"2d2cca09-4d0a-4038-9136-fd982105b028","Type":"ContainerDied","Data":"0ee07661525a1ac48c43d9f0e95911130647d9f6406d3daa5671c4072efc1748"} Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.487202 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lscnq" event={"ID":"2d2cca09-4d0a-4038-9136-fd982105b028","Type":"ContainerDied","Data":"e1361e7335158eddfdcb91369c5617110ea4567e8e5fd118a707ded70ca95b74"} Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.487370 4742 scope.go:117] "RemoveContainer" containerID="0ee07661525a1ac48c43d9f0e95911130647d9f6406d3daa5671c4072efc1748" Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.486325 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lscnq" Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.528860 4742 scope.go:117] "RemoveContainer" containerID="f0c52d62d5aa8ab7966eba80c9e88f9a4be3e03bce48a2b9a8e0dfea0a43702b" Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.533411 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lscnq"] Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.543722 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lscnq"] Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.583714 4742 scope.go:117] "RemoveContainer" containerID="301cdb20d90dd7313a5b818fee3bc84c1ab5aa4f7a5817ebddb0022b851e5a4d" Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.603731 4742 scope.go:117] "RemoveContainer" containerID="0ee07661525a1ac48c43d9f0e95911130647d9f6406d3daa5671c4072efc1748" Mar 17 12:15:54 crc kubenswrapper[4742]: E0317 12:15:54.604148 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ee07661525a1ac48c43d9f0e95911130647d9f6406d3daa5671c4072efc1748\": container with ID starting with 0ee07661525a1ac48c43d9f0e95911130647d9f6406d3daa5671c4072efc1748 not found: ID does not exist" containerID="0ee07661525a1ac48c43d9f0e95911130647d9f6406d3daa5671c4072efc1748" Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.604188 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ee07661525a1ac48c43d9f0e95911130647d9f6406d3daa5671c4072efc1748"} err="failed to get container status \"0ee07661525a1ac48c43d9f0e95911130647d9f6406d3daa5671c4072efc1748\": rpc error: code = NotFound desc = could not find container \"0ee07661525a1ac48c43d9f0e95911130647d9f6406d3daa5671c4072efc1748\": container with ID starting with 0ee07661525a1ac48c43d9f0e95911130647d9f6406d3daa5671c4072efc1748 not found: ID does not exist" Mar 17 
12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.604215 4742 scope.go:117] "RemoveContainer" containerID="f0c52d62d5aa8ab7966eba80c9e88f9a4be3e03bce48a2b9a8e0dfea0a43702b" Mar 17 12:15:54 crc kubenswrapper[4742]: E0317 12:15:54.604446 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0c52d62d5aa8ab7966eba80c9e88f9a4be3e03bce48a2b9a8e0dfea0a43702b\": container with ID starting with f0c52d62d5aa8ab7966eba80c9e88f9a4be3e03bce48a2b9a8e0dfea0a43702b not found: ID does not exist" containerID="f0c52d62d5aa8ab7966eba80c9e88f9a4be3e03bce48a2b9a8e0dfea0a43702b" Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.604476 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0c52d62d5aa8ab7966eba80c9e88f9a4be3e03bce48a2b9a8e0dfea0a43702b"} err="failed to get container status \"f0c52d62d5aa8ab7966eba80c9e88f9a4be3e03bce48a2b9a8e0dfea0a43702b\": rpc error: code = NotFound desc = could not find container \"f0c52d62d5aa8ab7966eba80c9e88f9a4be3e03bce48a2b9a8e0dfea0a43702b\": container with ID starting with f0c52d62d5aa8ab7966eba80c9e88f9a4be3e03bce48a2b9a8e0dfea0a43702b not found: ID does not exist" Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.604491 4742 scope.go:117] "RemoveContainer" containerID="301cdb20d90dd7313a5b818fee3bc84c1ab5aa4f7a5817ebddb0022b851e5a4d" Mar 17 12:15:54 crc kubenswrapper[4742]: E0317 12:15:54.604680 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"301cdb20d90dd7313a5b818fee3bc84c1ab5aa4f7a5817ebddb0022b851e5a4d\": container with ID starting with 301cdb20d90dd7313a5b818fee3bc84c1ab5aa4f7a5817ebddb0022b851e5a4d not found: ID does not exist" containerID="301cdb20d90dd7313a5b818fee3bc84c1ab5aa4f7a5817ebddb0022b851e5a4d" Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.604710 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"301cdb20d90dd7313a5b818fee3bc84c1ab5aa4f7a5817ebddb0022b851e5a4d"} err="failed to get container status \"301cdb20d90dd7313a5b818fee3bc84c1ab5aa4f7a5817ebddb0022b851e5a4d\": rpc error: code = NotFound desc = could not find container \"301cdb20d90dd7313a5b818fee3bc84c1ab5aa4f7a5817ebddb0022b851e5a4d\": container with ID starting with 301cdb20d90dd7313a5b818fee3bc84c1ab5aa4f7a5817ebddb0022b851e5a4d not found: ID does not exist" Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.663416 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:15:54 crc kubenswrapper[4742]: E0317 12:15:54.664011 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:15:54 crc kubenswrapper[4742]: I0317 12:15:54.673644 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d2cca09-4d0a-4038-9136-fd982105b028" path="/var/lib/kubelet/pods/2d2cca09-4d0a-4038-9136-fd982105b028/volumes" Mar 17 12:15:57 crc kubenswrapper[4742]: I0317 12:15:57.733468 4742 scope.go:117] "RemoveContainer" 
containerID="93817ed3cdb466999dc3b44859ea51e8c6851d26ddf4dca3d5259a85d5219e31" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.147012 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562496-2k8gz"] Mar 17 12:16:00 crc kubenswrapper[4742]: E0317 12:16:00.147647 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b600c3cc-d6a2-4ea6-a60f-76aaa957dca5" containerName="extract-content" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.147658 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="b600c3cc-d6a2-4ea6-a60f-76aaa957dca5" containerName="extract-content" Mar 17 12:16:00 crc kubenswrapper[4742]: E0317 12:16:00.147671 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2cca09-4d0a-4038-9136-fd982105b028" containerName="registry-server" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.147678 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2cca09-4d0a-4038-9136-fd982105b028" containerName="registry-server" Mar 17 12:16:00 crc kubenswrapper[4742]: E0317 12:16:00.147689 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2cca09-4d0a-4038-9136-fd982105b028" containerName="extract-utilities" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.147696 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2cca09-4d0a-4038-9136-fd982105b028" containerName="extract-utilities" Mar 17 12:16:00 crc kubenswrapper[4742]: E0317 12:16:00.147709 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b600c3cc-d6a2-4ea6-a60f-76aaa957dca5" containerName="registry-server" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.147716 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="b600c3cc-d6a2-4ea6-a60f-76aaa957dca5" containerName="registry-server" Mar 17 12:16:00 crc kubenswrapper[4742]: E0317 12:16:00.147734 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2cca09-4d0a-4038-9136-fd982105b028" containerName="extract-content" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.147741 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2cca09-4d0a-4038-9136-fd982105b028" containerName="extract-content" Mar 17 12:16:00 crc kubenswrapper[4742]: E0317 12:16:00.147757 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b600c3cc-d6a2-4ea6-a60f-76aaa957dca5" containerName="extract-utilities" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.147765 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="b600c3cc-d6a2-4ea6-a60f-76aaa957dca5" containerName="extract-utilities" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.148024 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d2cca09-4d0a-4038-9136-fd982105b028" containerName="registry-server" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.148039 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="b600c3cc-d6a2-4ea6-a60f-76aaa957dca5" containerName="registry-server" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.148744 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562496-2k8gz" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.151876 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.152433 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.152491 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.170428 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562496-2k8gz"] Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.283539 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5slh\" (UniqueName: \"kubernetes.io/projected/d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58-kube-api-access-b5slh\") pod \"auto-csr-approver-29562496-2k8gz\" (UID: \"d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58\") " pod="openshift-infra/auto-csr-approver-29562496-2k8gz" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.385444 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5slh\" (UniqueName: \"kubernetes.io/projected/d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58-kube-api-access-b5slh\") pod \"auto-csr-approver-29562496-2k8gz\" (UID: \"d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58\") " pod="openshift-infra/auto-csr-approver-29562496-2k8gz" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.406463 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5slh\" (UniqueName: \"kubernetes.io/projected/d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58-kube-api-access-b5slh\") pod \"auto-csr-approver-29562496-2k8gz\" (UID: \"d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58\") " pod="openshift-infra/auto-csr-approver-29562496-2k8gz" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.473432 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562496-2k8gz" Mar 17 12:16:00 crc kubenswrapper[4742]: I0317 12:16:00.977630 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562496-2k8gz"] Mar 17 12:16:01 crc kubenswrapper[4742]: I0317 12:16:01.567653 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562496-2k8gz" event={"ID":"d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58","Type":"ContainerStarted","Data":"1a34b68d477e46173f5fc25763647e0eee4f5eb20f15d732d02eefd4b7a98082"} Mar 17 12:16:03 crc kubenswrapper[4742]: I0317 12:16:03.600784 4742 generic.go:334] "Generic (PLEG): container finished" podID="d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58" containerID="a55fafc0d043385648298663871f05f345dfcffd91cbd9080a7503bc6db8ce64" exitCode=0 Mar 17 12:16:03 crc kubenswrapper[4742]: I0317 12:16:03.601534 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562496-2k8gz" event={"ID":"d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58","Type":"ContainerDied","Data":"a55fafc0d043385648298663871f05f345dfcffd91cbd9080a7503bc6db8ce64"} Mar 17 12:16:05 crc kubenswrapper[4742]: I0317 12:16:05.085635 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562496-2k8gz" Mar 17 12:16:05 crc kubenswrapper[4742]: I0317 12:16:05.151597 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5slh\" (UniqueName: \"kubernetes.io/projected/d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58-kube-api-access-b5slh\") pod \"d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58\" (UID: \"d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58\") " Mar 17 12:16:05 crc kubenswrapper[4742]: I0317 12:16:05.162694 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58-kube-api-access-b5slh" (OuterVolumeSpecName: "kube-api-access-b5slh") pod "d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58" (UID: "d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58"). InnerVolumeSpecName "kube-api-access-b5slh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:16:05 crc kubenswrapper[4742]: I0317 12:16:05.254357 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5slh\" (UniqueName: \"kubernetes.io/projected/d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58-kube-api-access-b5slh\") on node \"crc\" DevicePath \"\"" Mar 17 12:16:05 crc kubenswrapper[4742]: I0317 12:16:05.632639 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562496-2k8gz" event={"ID":"d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58","Type":"ContainerDied","Data":"1a34b68d477e46173f5fc25763647e0eee4f5eb20f15d732d02eefd4b7a98082"} Mar 17 12:16:05 crc kubenswrapper[4742]: I0317 12:16:05.632708 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a34b68d477e46173f5fc25763647e0eee4f5eb20f15d732d02eefd4b7a98082" Mar 17 12:16:05 crc kubenswrapper[4742]: I0317 12:16:05.632749 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562496-2k8gz" Mar 17 12:16:06 crc kubenswrapper[4742]: I0317 12:16:06.183065 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562490-qm8k9"] Mar 17 12:16:06 crc kubenswrapper[4742]: I0317 12:16:06.194281 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562490-qm8k9"] Mar 17 12:16:06 crc kubenswrapper[4742]: I0317 12:16:06.677289 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a1431b3-c664-40ff-971b-d336e91cc3c8" path="/var/lib/kubelet/pods/2a1431b3-c664-40ff-971b-d336e91cc3c8/volumes" Mar 17 12:16:09 crc kubenswrapper[4742]: I0317 12:16:09.663152 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:16:09 crc kubenswrapper[4742]: E0317 12:16:09.663759 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:16:23 crc kubenswrapper[4742]: I0317 12:16:23.663757 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:16:23 crc kubenswrapper[4742]: E0317 12:16:23.665203 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:16:34 crc kubenswrapper[4742]: I0317 12:16:34.664019 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:16:34 crc kubenswrapper[4742]: E0317 12:16:34.665019 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:16:45 crc kubenswrapper[4742]: I0317 12:16:45.663097 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:16:45 crc kubenswrapper[4742]: E0317 12:16:45.664127 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:16:56 crc kubenswrapper[4742]: I0317 12:16:56.238659 4742 generic.go:334] "Generic (PLEG): container finished" podID="0466a590-af75-4814-9161-b142a0a62674" 
containerID="c7472c9be7848bfef7e6a7577f50ce3a8d41e6cf0dc4bb17660210bf13db88e7" exitCode=0 Mar 17 12:16:56 crc kubenswrapper[4742]: I0317 12:16:56.238793 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mltbx/must-gather-4dhpl" event={"ID":"0466a590-af75-4814-9161-b142a0a62674","Type":"ContainerDied","Data":"c7472c9be7848bfef7e6a7577f50ce3a8d41e6cf0dc4bb17660210bf13db88e7"} Mar 17 12:16:56 crc kubenswrapper[4742]: I0317 12:16:56.240087 4742 scope.go:117] "RemoveContainer" containerID="c7472c9be7848bfef7e6a7577f50ce3a8d41e6cf0dc4bb17660210bf13db88e7" Mar 17 12:16:57 crc kubenswrapper[4742]: I0317 12:16:57.152356 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mltbx_must-gather-4dhpl_0466a590-af75-4814-9161-b142a0a62674/gather/0.log" Mar 17 12:16:57 crc kubenswrapper[4742]: I0317 12:16:57.822120 4742 scope.go:117] "RemoveContainer" containerID="8bc4783cbc6cf0e40f8359efe811d99c58b3e56f7b6aaff9ff4c505675166502" Mar 17 12:16:58 crc kubenswrapper[4742]: I0317 12:16:58.673893 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:16:58 crc kubenswrapper[4742]: E0317 12:16:58.674452 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:17:05 crc kubenswrapper[4742]: I0317 12:17:05.234892 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mltbx/must-gather-4dhpl"] Mar 17 12:17:05 crc kubenswrapper[4742]: I0317 12:17:05.235810 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-mltbx/must-gather-4dhpl" podUID="0466a590-af75-4814-9161-b142a0a62674" containerName="copy" containerID="cri-o://792c659793073ca4d0a8cfa615c1d573b2c8910eda00d37820f430d07c4bf6a8" gracePeriod=2 Mar 17 12:17:05 crc kubenswrapper[4742]: I0317 12:17:05.250102 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mltbx/must-gather-4dhpl"] Mar 17 12:17:05 crc kubenswrapper[4742]: I0317 12:17:05.664037 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mltbx_must-gather-4dhpl_0466a590-af75-4814-9161-b142a0a62674/copy/0.log" Mar 17 12:17:05 crc kubenswrapper[4742]: I0317 12:17:05.664746 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mltbx/must-gather-4dhpl" Mar 17 12:17:05 crc kubenswrapper[4742]: I0317 12:17:05.837443 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0466a590-af75-4814-9161-b142a0a62674-must-gather-output\") pod \"0466a590-af75-4814-9161-b142a0a62674\" (UID: \"0466a590-af75-4814-9161-b142a0a62674\") " Mar 17 12:17:05 crc kubenswrapper[4742]: I0317 12:17:05.837599 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn6pv\" (UniqueName: \"kubernetes.io/projected/0466a590-af75-4814-9161-b142a0a62674-kube-api-access-bn6pv\") pod \"0466a590-af75-4814-9161-b142a0a62674\" (UID: \"0466a590-af75-4814-9161-b142a0a62674\") " Mar 17 12:17:05 crc kubenswrapper[4742]: I0317 12:17:05.843589 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0466a590-af75-4814-9161-b142a0a62674-kube-api-access-bn6pv" (OuterVolumeSpecName: "kube-api-access-bn6pv") pod "0466a590-af75-4814-9161-b142a0a62674" (UID: "0466a590-af75-4814-9161-b142a0a62674"). InnerVolumeSpecName "kube-api-access-bn6pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:17:05 crc kubenswrapper[4742]: I0317 12:17:05.940613 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn6pv\" (UniqueName: \"kubernetes.io/projected/0466a590-af75-4814-9161-b142a0a62674-kube-api-access-bn6pv\") on node \"crc\" DevicePath \"\"" Mar 17 12:17:06 crc kubenswrapper[4742]: I0317 12:17:06.028777 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0466a590-af75-4814-9161-b142a0a62674-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0466a590-af75-4814-9161-b142a0a62674" (UID: "0466a590-af75-4814-9161-b142a0a62674"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:17:06 crc kubenswrapper[4742]: I0317 12:17:06.042663 4742 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0466a590-af75-4814-9161-b142a0a62674-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 17 12:17:06 crc kubenswrapper[4742]: I0317 12:17:06.359626 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mltbx_must-gather-4dhpl_0466a590-af75-4814-9161-b142a0a62674/copy/0.log" Mar 17 12:17:06 crc kubenswrapper[4742]: I0317 12:17:06.360051 4742 generic.go:334] "Generic (PLEG): container finished" podID="0466a590-af75-4814-9161-b142a0a62674" containerID="792c659793073ca4d0a8cfa615c1d573b2c8910eda00d37820f430d07c4bf6a8" exitCode=143 Mar 17 12:17:06 crc kubenswrapper[4742]: I0317 12:17:06.360102 4742 scope.go:117] "RemoveContainer" containerID="792c659793073ca4d0a8cfa615c1d573b2c8910eda00d37820f430d07c4bf6a8" Mar 17 12:17:06 crc kubenswrapper[4742]: I0317 12:17:06.360126 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mltbx/must-gather-4dhpl" Mar 17 12:17:06 crc kubenswrapper[4742]: I0317 12:17:06.386287 4742 scope.go:117] "RemoveContainer" containerID="c7472c9be7848bfef7e6a7577f50ce3a8d41e6cf0dc4bb17660210bf13db88e7" Mar 17 12:17:06 crc kubenswrapper[4742]: I0317 12:17:06.693033 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0466a590-af75-4814-9161-b142a0a62674" path="/var/lib/kubelet/pods/0466a590-af75-4814-9161-b142a0a62674/volumes" Mar 17 12:17:06 crc kubenswrapper[4742]: I0317 12:17:06.776002 4742 scope.go:117] "RemoveContainer" containerID="792c659793073ca4d0a8cfa615c1d573b2c8910eda00d37820f430d07c4bf6a8" Mar 17 12:17:06 crc kubenswrapper[4742]: E0317 12:17:06.776382 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"792c659793073ca4d0a8cfa615c1d573b2c8910eda00d37820f430d07c4bf6a8\": container with ID starting with 792c659793073ca4d0a8cfa615c1d573b2c8910eda00d37820f430d07c4bf6a8 not found: ID does not exist" containerID="792c659793073ca4d0a8cfa615c1d573b2c8910eda00d37820f430d07c4bf6a8" Mar 17 12:17:06 crc kubenswrapper[4742]: I0317 12:17:06.776418 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"792c659793073ca4d0a8cfa615c1d573b2c8910eda00d37820f430d07c4bf6a8"} err="failed to get container status \"792c659793073ca4d0a8cfa615c1d573b2c8910eda00d37820f430d07c4bf6a8\": rpc error: code = NotFound desc = could not find container \"792c659793073ca4d0a8cfa615c1d573b2c8910eda00d37820f430d07c4bf6a8\": container with ID starting with 792c659793073ca4d0a8cfa615c1d573b2c8910eda00d37820f430d07c4bf6a8 not found: ID does not exist" Mar 17 12:17:06 crc kubenswrapper[4742]: I0317 12:17:06.776436 4742 scope.go:117] "RemoveContainer" containerID="c7472c9be7848bfef7e6a7577f50ce3a8d41e6cf0dc4bb17660210bf13db88e7" Mar 17 12:17:06 crc kubenswrapper[4742]: E0317 12:17:06.776718 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7472c9be7848bfef7e6a7577f50ce3a8d41e6cf0dc4bb17660210bf13db88e7\": container with ID starting with c7472c9be7848bfef7e6a7577f50ce3a8d41e6cf0dc4bb17660210bf13db88e7 not found: ID does not exist" containerID="c7472c9be7848bfef7e6a7577f50ce3a8d41e6cf0dc4bb17660210bf13db88e7" Mar 17 12:17:06 crc kubenswrapper[4742]: I0317 12:17:06.776743 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7472c9be7848bfef7e6a7577f50ce3a8d41e6cf0dc4bb17660210bf13db88e7"} err="failed to get container status \"c7472c9be7848bfef7e6a7577f50ce3a8d41e6cf0dc4bb17660210bf13db88e7\": rpc error: code = NotFound desc = could not find container \"c7472c9be7848bfef7e6a7577f50ce3a8d41e6cf0dc4bb17660210bf13db88e7\": container with ID starting with c7472c9be7848bfef7e6a7577f50ce3a8d41e6cf0dc4bb17660210bf13db88e7 not found: ID does not exist" Mar 17 12:17:11 crc kubenswrapper[4742]: I0317 12:17:11.664807 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:17:11 crc kubenswrapper[4742]: E0317 12:17:11.665815 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:17:23 crc kubenswrapper[4742]: I0317 12:17:23.662875 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:17:24 crc kubenswrapper[4742]: I0317 12:17:24.572613 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"b4c9cacb2ad65768276b5012b7d6a56bc72a471d174cfcf7c4bf8a60597c5822"} Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.149895 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562498-fnkmh"] Mar 17 12:18:00 crc kubenswrapper[4742]: E0317 12:18:00.151062 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0466a590-af75-4814-9161-b142a0a62674" containerName="copy" Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.151082 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0466a590-af75-4814-9161-b142a0a62674" containerName="copy" Mar 17 12:18:00 crc kubenswrapper[4742]: E0317 12:18:00.151108 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0466a590-af75-4814-9161-b142a0a62674" containerName="gather" Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.151116 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0466a590-af75-4814-9161-b142a0a62674" containerName="gather" Mar 17 12:18:00 crc kubenswrapper[4742]: E0317 12:18:00.151147 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58" containerName="oc" Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.151155 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58" containerName="oc" Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.151379 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="0466a590-af75-4814-9161-b142a0a62674" containerName="gather" Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.151401 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="0466a590-af75-4814-9161-b142a0a62674" containerName="copy" Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.151415 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58" containerName="oc" Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.152275 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562498-fnkmh" Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.155267 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.155288 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.155318 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.171507 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562498-fnkmh"] Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.252805 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spk5q\" (UniqueName: \"kubernetes.io/projected/6a80dc9c-7f4a-48a4-9496-ef8edc764a15-kube-api-access-spk5q\") pod \"auto-csr-approver-29562498-fnkmh\" (UID: \"6a80dc9c-7f4a-48a4-9496-ef8edc764a15\") " pod="openshift-infra/auto-csr-approver-29562498-fnkmh" Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.355201 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spk5q\" (UniqueName: \"kubernetes.io/projected/6a80dc9c-7f4a-48a4-9496-ef8edc764a15-kube-api-access-spk5q\") pod \"auto-csr-approver-29562498-fnkmh\" (UID: \"6a80dc9c-7f4a-48a4-9496-ef8edc764a15\") " pod="openshift-infra/auto-csr-approver-29562498-fnkmh" Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.387479 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spk5q\" (UniqueName: \"kubernetes.io/projected/6a80dc9c-7f4a-48a4-9496-ef8edc764a15-kube-api-access-spk5q\") pod \"auto-csr-approver-29562498-fnkmh\" (UID: \"6a80dc9c-7f4a-48a4-9496-ef8edc764a15\") " pod="openshift-infra/auto-csr-approver-29562498-fnkmh" Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.490853 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562498-fnkmh" Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.951984 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562498-fnkmh"] Mar 17 12:18:00 crc kubenswrapper[4742]: I0317 12:18:00.982343 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562498-fnkmh" event={"ID":"6a80dc9c-7f4a-48a4-9496-ef8edc764a15","Type":"ContainerStarted","Data":"495918672aff81952714290b4adcc94c0a93bb59f26e9df1a5bfa007d10e7b3a"} Mar 17 12:18:03 crc kubenswrapper[4742]: I0317 12:18:03.003108 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562498-fnkmh" event={"ID":"6a80dc9c-7f4a-48a4-9496-ef8edc764a15","Type":"ContainerStarted","Data":"c8de0bdb1a4b867408bb92d47c1094c836a9752e1024f8b322cb9415d4eb6fb1"} Mar 17 12:18:03 crc kubenswrapper[4742]: I0317 12:18:03.025156 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562498-fnkmh" podStartSLOduration=1.352483277 podStartE2EDuration="3.025139488s" podCreationTimestamp="2026-03-17 12:18:00 +0000 UTC" firstStartedPulling="2026-03-17 12:18:00.956163618 +0000 UTC m=+3984.082291376" lastFinishedPulling="2026-03-17 12:18:02.628819829 +0000 UTC m=+3985.754947587" observedRunningTime="2026-03-17 12:18:03.017188919 +0000 UTC m=+3986.143316677" watchObservedRunningTime="2026-03-17 12:18:03.025139488 +0000 UTC m=+3986.151267246" Mar 17 12:18:04 crc kubenswrapper[4742]: I0317 12:18:04.020451 4742 generic.go:334] "Generic (PLEG): container finished" podID="6a80dc9c-7f4a-48a4-9496-ef8edc764a15" containerID="c8de0bdb1a4b867408bb92d47c1094c836a9752e1024f8b322cb9415d4eb6fb1" exitCode=0 Mar 17 12:18:04 crc kubenswrapper[4742]: I0317 12:18:04.020520 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562498-fnkmh" event={"ID":"6a80dc9c-7f4a-48a4-9496-ef8edc764a15","Type":"ContainerDied","Data":"c8de0bdb1a4b867408bb92d47c1094c836a9752e1024f8b322cb9415d4eb6fb1"} Mar 17 12:18:05 crc kubenswrapper[4742]: I0317 12:18:05.481834 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562498-fnkmh" Mar 17 12:18:05 crc kubenswrapper[4742]: I0317 12:18:05.609651 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spk5q\" (UniqueName: \"kubernetes.io/projected/6a80dc9c-7f4a-48a4-9496-ef8edc764a15-kube-api-access-spk5q\") pod \"6a80dc9c-7f4a-48a4-9496-ef8edc764a15\" (UID: \"6a80dc9c-7f4a-48a4-9496-ef8edc764a15\") " Mar 17 12:18:05 crc kubenswrapper[4742]: I0317 12:18:05.619408 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a80dc9c-7f4a-48a4-9496-ef8edc764a15-kube-api-access-spk5q" (OuterVolumeSpecName: "kube-api-access-spk5q") pod "6a80dc9c-7f4a-48a4-9496-ef8edc764a15" (UID: "6a80dc9c-7f4a-48a4-9496-ef8edc764a15"). InnerVolumeSpecName "kube-api-access-spk5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:18:05 crc kubenswrapper[4742]: I0317 12:18:05.713146 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spk5q\" (UniqueName: \"kubernetes.io/projected/6a80dc9c-7f4a-48a4-9496-ef8edc764a15-kube-api-access-spk5q\") on node \"crc\" DevicePath \"\"" Mar 17 12:18:06 crc kubenswrapper[4742]: I0317 12:18:06.049763 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562498-fnkmh" event={"ID":"6a80dc9c-7f4a-48a4-9496-ef8edc764a15","Type":"ContainerDied","Data":"495918672aff81952714290b4adcc94c0a93bb59f26e9df1a5bfa007d10e7b3a"} Mar 17 12:18:06 crc kubenswrapper[4742]: I0317 12:18:06.049825 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="495918672aff81952714290b4adcc94c0a93bb59f26e9df1a5bfa007d10e7b3a" Mar 17 12:18:06 crc kubenswrapper[4742]: I0317 12:18:06.049900 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562498-fnkmh" Mar 17 12:18:06 crc kubenswrapper[4742]: I0317 12:18:06.104832 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562492-cvvvl"] Mar 17 12:18:06 crc kubenswrapper[4742]: I0317 12:18:06.115941 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562492-cvvvl"] Mar 17 12:18:06 crc kubenswrapper[4742]: I0317 12:18:06.678711 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21c511e2-9f3d-45fd-a671-37a745506f9b" path="/var/lib/kubelet/pods/21c511e2-9f3d-45fd-a671-37a745506f9b/volumes" Mar 17 12:18:57 crc kubenswrapper[4742]: I0317 12:18:57.969019 4742 scope.go:117] "RemoveContainer" containerID="b79317541ea8ef8e901e52b58f90577e9725970beec390528d1058e09e7700d8" Mar 17 12:19:48 crc kubenswrapper[4742]: I0317 12:19:48.043997 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 12:19:48 crc kubenswrapper[4742]: I0317 12:19:48.044559 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 12:20:00 crc kubenswrapper[4742]: I0317 12:20:00.171387 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562500-mpzhw"] Mar 17 12:20:00 crc kubenswrapper[4742]: E0317 12:20:00.172542 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a80dc9c-7f4a-48a4-9496-ef8edc764a15" containerName="oc" Mar 17 12:20:00 crc kubenswrapper[4742]: I0317 12:20:00.172564 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a80dc9c-7f4a-48a4-9496-ef8edc764a15" containerName="oc" Mar 17 12:20:00 crc kubenswrapper[4742]: I0317 12:20:00.172949 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a80dc9c-7f4a-48a4-9496-ef8edc764a15" containerName="oc" Mar 17 12:20:00 crc kubenswrapper[4742]: I0317 12:20:00.174070 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562500-mpzhw" Mar 17 12:20:00 crc kubenswrapper[4742]: I0317 12:20:00.177007 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 12:20:00 crc kubenswrapper[4742]: I0317 12:20:00.177218 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 12:20:00 crc kubenswrapper[4742]: I0317 12:20:00.177559 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 12:20:00 crc kubenswrapper[4742]: I0317 12:20:00.204408 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562500-mpzhw"] Mar 17 12:20:00 crc kubenswrapper[4742]: I0317 12:20:00.329512 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnnwl\" (UniqueName: \"kubernetes.io/projected/0252c4c4-de55-42c1-97c0-34bc9d6ea579-kube-api-access-xnnwl\") pod \"auto-csr-approver-29562500-mpzhw\" (UID: \"0252c4c4-de55-42c1-97c0-34bc9d6ea579\") " pod="openshift-infra/auto-csr-approver-29562500-mpzhw" Mar 17 12:20:00 crc kubenswrapper[4742]: I0317 12:20:00.432068 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnnwl\" (UniqueName: \"kubernetes.io/projected/0252c4c4-de55-42c1-97c0-34bc9d6ea579-kube-api-access-xnnwl\") pod \"auto-csr-approver-29562500-mpzhw\" (UID: \"0252c4c4-de55-42c1-97c0-34bc9d6ea579\") " pod="openshift-infra/auto-csr-approver-29562500-mpzhw" Mar 17 12:20:00 crc kubenswrapper[4742]: I0317 12:20:00.453431 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnnwl\" (UniqueName: \"kubernetes.io/projected/0252c4c4-de55-42c1-97c0-34bc9d6ea579-kube-api-access-xnnwl\") pod \"auto-csr-approver-29562500-mpzhw\" (UID: \"0252c4c4-de55-42c1-97c0-34bc9d6ea579\") " pod="openshift-infra/auto-csr-approver-29562500-mpzhw" Mar 17 12:20:00 crc kubenswrapper[4742]: I0317 12:20:00.501514 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562500-mpzhw" Mar 17 12:20:00 crc kubenswrapper[4742]: I0317 12:20:00.955990 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562500-mpzhw"] Mar 17 12:20:01 crc kubenswrapper[4742]: I0317 12:20:01.652330 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562500-mpzhw" event={"ID":"0252c4c4-de55-42c1-97c0-34bc9d6ea579","Type":"ContainerStarted","Data":"21af5c8bb2ed3ac6d67c1d81742a20416e3448ae3578278ada70ca349e54ec45"} Mar 17 12:20:02 crc kubenswrapper[4742]: I0317 12:20:02.671184 4742 generic.go:334] "Generic (PLEG): container finished" podID="0252c4c4-de55-42c1-97c0-34bc9d6ea579" containerID="3c8d221bed280fb0f266eb63f78a56d3a62938da5d175f727e0c935d0b562039" exitCode=0 Mar 17 12:20:02 crc kubenswrapper[4742]: I0317 12:20:02.684209 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562500-mpzhw" event={"ID":"0252c4c4-de55-42c1-97c0-34bc9d6ea579","Type":"ContainerDied","Data":"3c8d221bed280fb0f266eb63f78a56d3a62938da5d175f727e0c935d0b562039"} Mar 17 12:20:04 crc kubenswrapper[4742]: I0317 12:20:04.090479 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562500-mpzhw" Mar 17 12:20:04 crc kubenswrapper[4742]: I0317 12:20:04.210153 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnnwl\" (UniqueName: \"kubernetes.io/projected/0252c4c4-de55-42c1-97c0-34bc9d6ea579-kube-api-access-xnnwl\") pod \"0252c4c4-de55-42c1-97c0-34bc9d6ea579\" (UID: \"0252c4c4-de55-42c1-97c0-34bc9d6ea579\") " Mar 17 12:20:04 crc kubenswrapper[4742]: I0317 12:20:04.216166 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0252c4c4-de55-42c1-97c0-34bc9d6ea579-kube-api-access-xnnwl" (OuterVolumeSpecName: "kube-api-access-xnnwl") pod "0252c4c4-de55-42c1-97c0-34bc9d6ea579" (UID: "0252c4c4-de55-42c1-97c0-34bc9d6ea579"). InnerVolumeSpecName "kube-api-access-xnnwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:20:04 crc kubenswrapper[4742]: I0317 12:20:04.312791 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnnwl\" (UniqueName: \"kubernetes.io/projected/0252c4c4-de55-42c1-97c0-34bc9d6ea579-kube-api-access-xnnwl\") on node \"crc\" DevicePath \"\"" Mar 17 12:20:04 crc kubenswrapper[4742]: I0317 12:20:04.698801 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562500-mpzhw" event={"ID":"0252c4c4-de55-42c1-97c0-34bc9d6ea579","Type":"ContainerDied","Data":"21af5c8bb2ed3ac6d67c1d81742a20416e3448ae3578278ada70ca349e54ec45"} Mar 17 12:20:04 crc kubenswrapper[4742]: I0317 12:20:04.699165 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21af5c8bb2ed3ac6d67c1d81742a20416e3448ae3578278ada70ca349e54ec45" Mar 17 12:20:04 crc kubenswrapper[4742]: I0317 12:20:04.698884 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562500-mpzhw" Mar 17 12:20:05 crc kubenswrapper[4742]: I0317 12:20:05.168584 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562494-684q5"] Mar 17 12:20:05 crc kubenswrapper[4742]: I0317 12:20:05.183006 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562494-684q5"] Mar 17 12:20:06 crc kubenswrapper[4742]: I0317 12:20:06.681077 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea9d544d-eba4-4212-bfc5-7aab8dc25615" path="/var/lib/kubelet/pods/ea9d544d-eba4-4212-bfc5-7aab8dc25615/volumes" Mar 17 12:20:15 crc kubenswrapper[4742]: I0317 12:20:15.211753 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rcph6/must-gather-nvl7j"] Mar 17 12:20:15 crc kubenswrapper[4742]: E0317 12:20:15.212621 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0252c4c4-de55-42c1-97c0-34bc9d6ea579" containerName="oc" Mar 17 12:20:15 crc kubenswrapper[4742]: I0317 12:20:15.212636 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="0252c4c4-de55-42c1-97c0-34bc9d6ea579" containerName="oc" Mar 17 12:20:15 crc kubenswrapper[4742]: I0317 12:20:15.212817 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="0252c4c4-de55-42c1-97c0-34bc9d6ea579" containerName="oc" Mar 17 12:20:15 crc kubenswrapper[4742]: I0317 12:20:15.213970 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rcph6/must-gather-nvl7j" Mar 17 12:20:15 crc kubenswrapper[4742]: I0317 12:20:15.215878 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rcph6"/"openshift-service-ca.crt" Mar 17 12:20:15 crc kubenswrapper[4742]: I0317 12:20:15.217353 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rcph6"/"default-dockercfg-n8wgc" Mar 17 12:20:15 crc kubenswrapper[4742]: I0317 12:20:15.218644 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rcph6"/"kube-root-ca.crt" Mar 17 12:20:15 crc kubenswrapper[4742]: I0317 12:20:15.227011 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rcph6/must-gather-nvl7j"] Mar 17 12:20:15 crc kubenswrapper[4742]: I0317 12:20:15.303961 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09d769ba-43cf-4abc-aec6-f21879cc4c38-must-gather-output\") pod \"must-gather-nvl7j\" (UID: \"09d769ba-43cf-4abc-aec6-f21879cc4c38\") " pod="openshift-must-gather-rcph6/must-gather-nvl7j" Mar 17 12:20:15 crc kubenswrapper[4742]: I0317 12:20:15.304049 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fghvc\" (UniqueName: \"kubernetes.io/projected/09d769ba-43cf-4abc-aec6-f21879cc4c38-kube-api-access-fghvc\") pod \"must-gather-nvl7j\" (UID: \"09d769ba-43cf-4abc-aec6-f21879cc4c38\") " pod="openshift-must-gather-rcph6/must-gather-nvl7j" Mar 17 12:20:15 crc kubenswrapper[4742]: I0317 12:20:15.406507 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09d769ba-43cf-4abc-aec6-f21879cc4c38-must-gather-output\") pod \"must-gather-nvl7j\" (UID: \"09d769ba-43cf-4abc-aec6-f21879cc4c38\") " pod="openshift-must-gather-rcph6/must-gather-nvl7j" Mar 17 12:20:15 crc kubenswrapper[4742]: I0317 12:20:15.406627 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fghvc\" (UniqueName: \"kubernetes.io/projected/09d769ba-43cf-4abc-aec6-f21879cc4c38-kube-api-access-fghvc\") pod \"must-gather-nvl7j\" (UID: \"09d769ba-43cf-4abc-aec6-f21879cc4c38\") " pod="openshift-must-gather-rcph6/must-gather-nvl7j" Mar 17 12:20:15 crc kubenswrapper[4742]: I0317 12:20:15.407099 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09d769ba-43cf-4abc-aec6-f21879cc4c38-must-gather-output\") pod \"must-gather-nvl7j\" (UID: \"09d769ba-43cf-4abc-aec6-f21879cc4c38\") " pod="openshift-must-gather-rcph6/must-gather-nvl7j" Mar 17 12:20:15 crc kubenswrapper[4742]: I0317 12:20:15.434616 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fghvc\" (UniqueName: \"kubernetes.io/projected/09d769ba-43cf-4abc-aec6-f21879cc4c38-kube-api-access-fghvc\") pod \"must-gather-nvl7j\" (UID: \"09d769ba-43cf-4abc-aec6-f21879cc4c38\") " pod="openshift-must-gather-rcph6/must-gather-nvl7j" Mar 17 12:20:15 crc kubenswrapper[4742]: I0317 12:20:15.531274 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rcph6/must-gather-nvl7j" Mar 17 12:20:16 crc kubenswrapper[4742]: I0317 12:20:16.050228 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rcph6/must-gather-nvl7j"] Mar 17 12:20:16 crc kubenswrapper[4742]: I0317 12:20:16.871327 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rcph6/must-gather-nvl7j" event={"ID":"09d769ba-43cf-4abc-aec6-f21879cc4c38","Type":"ContainerStarted","Data":"fe2544bc03d670fc74002cffe9750fc2f05ac02966dd14a2e4670387d3d9ccbf"} Mar 17 12:20:16 crc kubenswrapper[4742]: I0317 12:20:16.871706 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rcph6/must-gather-nvl7j" event={"ID":"09d769ba-43cf-4abc-aec6-f21879cc4c38","Type":"ContainerStarted","Data":"2b007bf8242742771d437d12d0e69b64e5f28004bccd213e799b8a450c45389d"} Mar 17 12:20:16 crc kubenswrapper[4742]: I0317 12:20:16.871743 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rcph6/must-gather-nvl7j" event={"ID":"09d769ba-43cf-4abc-aec6-f21879cc4c38","Type":"ContainerStarted","Data":"b9d36a510d9ca8abfbffb07e6375f60e8953a9fc42d433309cc8d020ea4bacad"} Mar 17 12:20:16 crc kubenswrapper[4742]: I0317 12:20:16.906963 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rcph6/must-gather-nvl7j" podStartSLOduration=1.906936728 podStartE2EDuration="1.906936728s" podCreationTimestamp="2026-03-17 12:20:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 12:20:16.893444777 +0000 UTC m=+4120.019572595" watchObservedRunningTime="2026-03-17 12:20:16.906936728 +0000 UTC m=+4120.033064526" Mar 17 12:20:18 crc kubenswrapper[4742]: I0317 12:20:18.044139 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 12:20:18 crc kubenswrapper[4742]: I0317 12:20:18.044422 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 12:20:19 crc kubenswrapper[4742]: I0317 12:20:19.775109 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rcph6/crc-debug-c99ct"] Mar 17 12:20:19 crc kubenswrapper[4742]: I0317 12:20:19.777076 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rcph6/crc-debug-c99ct" Mar 17 12:20:19 crc kubenswrapper[4742]: I0317 12:20:19.892811 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk9kf\" (UniqueName: \"kubernetes.io/projected/7da57f47-676b-4be4-aa84-96b4c55e555b-kube-api-access-kk9kf\") pod \"crc-debug-c99ct\" (UID: \"7da57f47-676b-4be4-aa84-96b4c55e555b\") " pod="openshift-must-gather-rcph6/crc-debug-c99ct" Mar 17 12:20:19 crc kubenswrapper[4742]: I0317 12:20:19.893089 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7da57f47-676b-4be4-aa84-96b4c55e555b-host\") pod \"crc-debug-c99ct\" (UID: \"7da57f47-676b-4be4-aa84-96b4c55e555b\") " pod="openshift-must-gather-rcph6/crc-debug-c99ct" Mar 17 12:20:19 crc kubenswrapper[4742]: I0317 12:20:19.994686 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk9kf\" (UniqueName: \"kubernetes.io/projected/7da57f47-676b-4be4-aa84-96b4c55e555b-kube-api-access-kk9kf\") pod \"crc-debug-c99ct\" (UID: \"7da57f47-676b-4be4-aa84-96b4c55e555b\") " pod="openshift-must-gather-rcph6/crc-debug-c99ct" Mar 17 12:20:19 crc kubenswrapper[4742]: I0317 12:20:19.994729 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7da57f47-676b-4be4-aa84-96b4c55e555b-host\") pod \"crc-debug-c99ct\" (UID: \"7da57f47-676b-4be4-aa84-96b4c55e555b\") " pod="openshift-must-gather-rcph6/crc-debug-c99ct" Mar 17 12:20:19 crc kubenswrapper[4742]: I0317 12:20:19.994885 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7da57f47-676b-4be4-aa84-96b4c55e555b-host\") pod \"crc-debug-c99ct\" (UID: \"7da57f47-676b-4be4-aa84-96b4c55e555b\") " pod="openshift-must-gather-rcph6/crc-debug-c99ct" Mar 17 12:20:20 crc kubenswrapper[4742]: I0317 12:20:20.013629 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk9kf\" (UniqueName: \"kubernetes.io/projected/7da57f47-676b-4be4-aa84-96b4c55e555b-kube-api-access-kk9kf\") pod \"crc-debug-c99ct\" (UID: \"7da57f47-676b-4be4-aa84-96b4c55e555b\") " pod="openshift-must-gather-rcph6/crc-debug-c99ct" Mar 17 12:20:20 crc kubenswrapper[4742]: I0317 12:20:20.099361 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rcph6/crc-debug-c99ct" Mar 17 12:20:20 crc kubenswrapper[4742]: W0317 12:20:20.127482 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7da57f47_676b_4be4_aa84_96b4c55e555b.slice/crio-930d53fd62906645754a2845e4bfed1a80bcef5f5a46f38f88bf892237e1ffdd WatchSource:0}: Error finding container 930d53fd62906645754a2845e4bfed1a80bcef5f5a46f38f88bf892237e1ffdd: Status 404 returned error can't find the container with id 930d53fd62906645754a2845e4bfed1a80bcef5f5a46f38f88bf892237e1ffdd Mar 17 12:20:20 crc kubenswrapper[4742]: I0317 12:20:20.925366 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rcph6/crc-debug-c99ct" event={"ID":"7da57f47-676b-4be4-aa84-96b4c55e555b","Type":"ContainerStarted","Data":"2f7b501fee989bcec84dd1b6d910b523354603980aae8d86706aab0522b31a74"} Mar 17 12:20:20 crc kubenswrapper[4742]: I0317 12:20:20.925883 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rcph6/crc-debug-c99ct" event={"ID":"7da57f47-676b-4be4-aa84-96b4c55e555b","Type":"ContainerStarted","Data":"930d53fd62906645754a2845e4bfed1a80bcef5f5a46f38f88bf892237e1ffdd"} Mar 17 12:20:27 crc kubenswrapper[4742]: I0317 12:20:27.952035 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rcph6/crc-debug-c99ct" podStartSLOduration=8.952017661 podStartE2EDuration="8.952017661s" podCreationTimestamp="2026-03-17 12:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-17 12:20:20.940360803 +0000 UTC m=+4124.066488561" watchObservedRunningTime="2026-03-17 12:20:27.952017661 +0000 UTC m=+4131.078145419" Mar 17 12:20:27 crc kubenswrapper[4742]: I0317 12:20:27.966583 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xqrwr"] Mar 17 12:20:27 crc kubenswrapper[4742]: I0317 12:20:27.980644 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xqrwr" Mar 17 12:20:27 crc kubenswrapper[4742]: I0317 12:20:27.997675 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xqrwr"] Mar 17 12:20:28 crc kubenswrapper[4742]: I0317 12:20:28.087090 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2345c7c-e927-4754-af27-9c836794d9c8-catalog-content\") pod \"redhat-operators-xqrwr\" (UID: \"b2345c7c-e927-4754-af27-9c836794d9c8\") " pod="openshift-marketplace/redhat-operators-xqrwr" Mar 17 12:20:28 crc kubenswrapper[4742]: I0317 12:20:28.087267 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2dd2\" (UniqueName: \"kubernetes.io/projected/b2345c7c-e927-4754-af27-9c836794d9c8-kube-api-access-d2dd2\") pod \"redhat-operators-xqrwr\" (UID: \"b2345c7c-e927-4754-af27-9c836794d9c8\") " pod="openshift-marketplace/redhat-operators-xqrwr" Mar 17 12:20:28 crc kubenswrapper[4742]: I0317 12:20:28.087363 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2345c7c-e927-4754-af27-9c836794d9c8-utilities\") pod \"redhat-operators-xqrwr\" (UID: \"b2345c7c-e927-4754-af27-9c836794d9c8\") " pod="openshift-marketplace/redhat-operators-xqrwr" Mar 17 12:20:28 crc kubenswrapper[4742]: I0317 12:20:28.189494 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2345c7c-e927-4754-af27-9c836794d9c8-utilities\") pod \"redhat-operators-xqrwr\" (UID: \"b2345c7c-e927-4754-af27-9c836794d9c8\") " pod="openshift-marketplace/redhat-operators-xqrwr" Mar 17 12:20:28 crc kubenswrapper[4742]: I0317 12:20:28.189608 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2345c7c-e927-4754-af27-9c836794d9c8-catalog-content\") pod \"redhat-operators-xqrwr\" (UID: \"b2345c7c-e927-4754-af27-9c836794d9c8\") " pod="openshift-marketplace/redhat-operators-xqrwr" Mar 17 12:20:28 crc kubenswrapper[4742]: I0317 12:20:28.189725 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2dd2\" (UniqueName: \"kubernetes.io/projected/b2345c7c-e927-4754-af27-9c836794d9c8-kube-api-access-d2dd2\") pod \"redhat-operators-xqrwr\" (UID: \"b2345c7c-e927-4754-af27-9c836794d9c8\") " pod="openshift-marketplace/redhat-operators-xqrwr" Mar 17 12:20:28 crc kubenswrapper[4742]: I0317 12:20:28.190017 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2345c7c-e927-4754-af27-9c836794d9c8-utilities\") pod \"redhat-operators-xqrwr\" (UID: \"b2345c7c-e927-4754-af27-9c836794d9c8\") " pod="openshift-marketplace/redhat-operators-xqrwr" Mar 17 12:20:28 crc kubenswrapper[4742]: I0317 12:20:28.190053 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2345c7c-e927-4754-af27-9c836794d9c8-catalog-content\") pod \"redhat-operators-xqrwr\" (UID: \"b2345c7c-e927-4754-af27-9c836794d9c8\") " pod="openshift-marketplace/redhat-operators-xqrwr" Mar 17 12:20:28 crc kubenswrapper[4742]: I0317 12:20:28.216731 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-d2dd2\" (UniqueName: \"kubernetes.io/projected/b2345c7c-e927-4754-af27-9c836794d9c8-kube-api-access-d2dd2\") pod \"redhat-operators-xqrwr\" (UID: \"b2345c7c-e927-4754-af27-9c836794d9c8\") " pod="openshift-marketplace/redhat-operators-xqrwr" Mar 17 12:20:28 crc kubenswrapper[4742]: I0317 12:20:28.355814 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xqrwr" Mar 17 12:20:29 crc kubenswrapper[4742]: I0317 12:20:29.021539 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xqrwr"] Mar 17 12:20:30 crc kubenswrapper[4742]: I0317 12:20:30.023341 4742 generic.go:334] "Generic (PLEG): container finished" podID="b2345c7c-e927-4754-af27-9c836794d9c8" containerID="37243bd381b3c5a538b474a1921b39650e9ac6f3db5861f43db86b841c33bf5f" exitCode=0 Mar 17 12:20:30 crc kubenswrapper[4742]: I0317 12:20:30.023402 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqrwr" event={"ID":"b2345c7c-e927-4754-af27-9c836794d9c8","Type":"ContainerDied","Data":"37243bd381b3c5a538b474a1921b39650e9ac6f3db5861f43db86b841c33bf5f"} Mar 17 12:20:30 crc kubenswrapper[4742]: I0317 12:20:30.023993 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqrwr" event={"ID":"b2345c7c-e927-4754-af27-9c836794d9c8","Type":"ContainerStarted","Data":"5a42c75c166337319d6e1f0ebe2a4c6e3049a04a70734fb93c876d57d0b6cd5a"} Mar 17 12:20:42 crc kubenswrapper[4742]: I0317 12:20:42.142455 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqrwr" event={"ID":"b2345c7c-e927-4754-af27-9c836794d9c8","Type":"ContainerStarted","Data":"24c8ba724df18e5b969127f68f813c9654ac11b4392e7fa2afefdfe8601278a9"} Mar 17 12:20:44 crc kubenswrapper[4742]: I0317 12:20:44.165724 4742 generic.go:334] "Generic (PLEG): container finished" podID="b2345c7c-e927-4754-af27-9c836794d9c8" containerID="24c8ba724df18e5b969127f68f813c9654ac11b4392e7fa2afefdfe8601278a9" exitCode=0 Mar 17 12:20:44 crc kubenswrapper[4742]: I0317 12:20:44.165877 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqrwr" event={"ID":"b2345c7c-e927-4754-af27-9c836794d9c8","Type":"ContainerDied","Data":"24c8ba724df18e5b969127f68f813c9654ac11b4392e7fa2afefdfe8601278a9"} Mar 17 12:20:44 crc kubenswrapper[4742]: I0317 12:20:44.170137 4742 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 17 12:20:46 crc kubenswrapper[4742]: I0317 12:20:46.189026 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqrwr" event={"ID":"b2345c7c-e927-4754-af27-9c836794d9c8","Type":"ContainerStarted","Data":"74cd2f4ff0ec041596cdfcdacf1a3f34565972dbce9166edf6f6d280b439d1fd"} Mar 17 12:20:46 crc kubenswrapper[4742]: I0317 12:20:46.216847 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xqrwr" podStartSLOduration=4.502100373 podStartE2EDuration="19.216823964s" podCreationTimestamp="2026-03-17 12:20:27 +0000 UTC" firstStartedPulling="2026-03-17 12:20:30.025942167 +0000 UTC m=+4133.152069925" lastFinishedPulling="2026-03-17 12:20:44.740665758 +0000 UTC m=+4147.866793516" observedRunningTime="2026-03-17 12:20:46.208921188 +0000 UTC m=+4149.335048946" watchObservedRunningTime="2026-03-17 12:20:46.216823964 +0000 UTC m=+4149.342951722" Mar 17 12:20:48 crc 
kubenswrapper[4742]: I0317 12:20:48.044063 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 12:20:48 crc kubenswrapper[4742]: I0317 12:20:48.044480 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 12:20:48 crc kubenswrapper[4742]: I0317 12:20:48.044549 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" Mar 17 12:20:48 crc kubenswrapper[4742]: I0317 12:20:48.045478 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b4c9cacb2ad65768276b5012b7d6a56bc72a471d174cfcf7c4bf8a60597c5822"} pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 17 12:20:48 crc kubenswrapper[4742]: I0317 12:20:48.045560 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" containerID="cri-o://b4c9cacb2ad65768276b5012b7d6a56bc72a471d174cfcf7c4bf8a60597c5822" gracePeriod=600 Mar 17 12:20:48 crc kubenswrapper[4742]: I0317 12:20:48.356756 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xqrwr" Mar 17 12:20:48 crc kubenswrapper[4742]: I0317 12:20:48.356811 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xqrwr" Mar 17 12:20:49 crc kubenswrapper[4742]: I0317 12:20:49.211147 4742 generic.go:334] "Generic (PLEG): container finished" podID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerID="b4c9cacb2ad65768276b5012b7d6a56bc72a471d174cfcf7c4bf8a60597c5822" exitCode=0 Mar 17 12:20:49 crc kubenswrapper[4742]: I0317 12:20:49.211231 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerDied","Data":"b4c9cacb2ad65768276b5012b7d6a56bc72a471d174cfcf7c4bf8a60597c5822"} Mar 17 12:20:49 crc kubenswrapper[4742]: I0317 12:20:49.211510 4742 scope.go:117] "RemoveContainer" containerID="6b66728f8d8626930ff2aea23971ded818d66ea306ee15eeb07844a5e6e63b1f" Mar 17 12:20:49 crc kubenswrapper[4742]: I0317 12:20:49.416919 4742 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xqrwr" podUID="b2345c7c-e927-4754-af27-9c836794d9c8" containerName="registry-server" probeResult="failure" output=< Mar 17 12:20:49 crc kubenswrapper[4742]: timeout: failed to connect service ":50051" within 1s Mar 17 12:20:49 crc kubenswrapper[4742]: > Mar 17 12:20:51 crc kubenswrapper[4742]: I0317 12:20:51.230578 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" 
event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0"} Mar 17 12:20:55 crc kubenswrapper[4742]: I0317 12:20:55.263596 4742 generic.go:334] "Generic (PLEG): container finished" podID="7da57f47-676b-4be4-aa84-96b4c55e555b" containerID="2f7b501fee989bcec84dd1b6d910b523354603980aae8d86706aab0522b31a74" exitCode=0 Mar 17 12:20:55 crc kubenswrapper[4742]: I0317 12:20:55.263683 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rcph6/crc-debug-c99ct" event={"ID":"7da57f47-676b-4be4-aa84-96b4c55e555b","Type":"ContainerDied","Data":"2f7b501fee989bcec84dd1b6d910b523354603980aae8d86706aab0522b31a74"} Mar 17 12:20:56 crc kubenswrapper[4742]: I0317 12:20:56.861321 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rcph6/crc-debug-c99ct" Mar 17 12:20:56 crc kubenswrapper[4742]: I0317 12:20:56.902368 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rcph6/crc-debug-c99ct"] Mar 17 12:20:56 crc kubenswrapper[4742]: I0317 12:20:56.925886 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rcph6/crc-debug-c99ct"] Mar 17 12:20:56 crc kubenswrapper[4742]: I0317 12:20:56.986685 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk9kf\" (UniqueName: \"kubernetes.io/projected/7da57f47-676b-4be4-aa84-96b4c55e555b-kube-api-access-kk9kf\") pod \"7da57f47-676b-4be4-aa84-96b4c55e555b\" (UID: \"7da57f47-676b-4be4-aa84-96b4c55e555b\") " Mar 17 12:20:56 crc kubenswrapper[4742]: I0317 12:20:56.986776 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7da57f47-676b-4be4-aa84-96b4c55e555b-host\") pod \"7da57f47-676b-4be4-aa84-96b4c55e555b\" (UID: \"7da57f47-676b-4be4-aa84-96b4c55e555b\") " Mar 17 12:20:56 crc kubenswrapper[4742]: I0317 12:20:56.986941 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7da57f47-676b-4be4-aa84-96b4c55e555b-host" (OuterVolumeSpecName: "host") pod "7da57f47-676b-4be4-aa84-96b4c55e555b" (UID: "7da57f47-676b-4be4-aa84-96b4c55e555b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 12:20:56 crc kubenswrapper[4742]: I0317 12:20:56.987256 4742 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7da57f47-676b-4be4-aa84-96b4c55e555b-host\") on node \"crc\" DevicePath \"\"" Mar 17 12:20:56 crc kubenswrapper[4742]: I0317 12:20:56.994791 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7da57f47-676b-4be4-aa84-96b4c55e555b-kube-api-access-kk9kf" (OuterVolumeSpecName: "kube-api-access-kk9kf") pod "7da57f47-676b-4be4-aa84-96b4c55e555b" (UID: "7da57f47-676b-4be4-aa84-96b4c55e555b"). InnerVolumeSpecName "kube-api-access-kk9kf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:20:57 crc kubenswrapper[4742]: I0317 12:20:57.088818 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk9kf\" (UniqueName: \"kubernetes.io/projected/7da57f47-676b-4be4-aa84-96b4c55e555b-kube-api-access-kk9kf\") on node \"crc\" DevicePath \"\"" Mar 17 12:20:57 crc kubenswrapper[4742]: I0317 12:20:57.294441 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="930d53fd62906645754a2845e4bfed1a80bcef5f5a46f38f88bf892237e1ffdd" Mar 17 12:20:57 crc kubenswrapper[4742]: I0317 12:20:57.294518 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rcph6/crc-debug-c99ct" Mar 17 12:20:58 crc kubenswrapper[4742]: I0317 12:20:58.085177 4742 scope.go:117] "RemoveContainer" containerID="b6e2a03c5d15ddeff1558752e4c781d4491912fdaea734c00d450fbfb1474bec" Mar 17 12:20:58 crc kubenswrapper[4742]: I0317 12:20:58.102841 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rcph6/crc-debug-thwr9"] Mar 17 12:20:58 crc kubenswrapper[4742]: E0317 12:20:58.103256 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7da57f47-676b-4be4-aa84-96b4c55e555b" containerName="container-00" Mar 17 12:20:58 crc kubenswrapper[4742]: I0317 12:20:58.103275 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="7da57f47-676b-4be4-aa84-96b4c55e555b" containerName="container-00" Mar 17 12:20:58 crc kubenswrapper[4742]: I0317 12:20:58.103443 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="7da57f47-676b-4be4-aa84-96b4c55e555b" containerName="container-00" Mar 17 12:20:58 crc kubenswrapper[4742]: I0317 12:20:58.104027 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rcph6/crc-debug-thwr9" Mar 17 12:20:58 crc kubenswrapper[4742]: I0317 12:20:58.207842 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dldp6\" (UniqueName: \"kubernetes.io/projected/d274714f-0735-4c14-94c9-8ac28834edaa-kube-api-access-dldp6\") pod \"crc-debug-thwr9\" (UID: \"d274714f-0735-4c14-94c9-8ac28834edaa\") " pod="openshift-must-gather-rcph6/crc-debug-thwr9" Mar 17 12:20:58 crc kubenswrapper[4742]: I0317 12:20:58.207922 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d274714f-0735-4c14-94c9-8ac28834edaa-host\") pod \"crc-debug-thwr9\" (UID: \"d274714f-0735-4c14-94c9-8ac28834edaa\") " pod="openshift-must-gather-rcph6/crc-debug-thwr9" Mar 17 12:20:58 crc kubenswrapper[4742]: I0317 12:20:58.309498 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dldp6\" (UniqueName: \"kubernetes.io/projected/d274714f-0735-4c14-94c9-8ac28834edaa-kube-api-access-dldp6\") pod \"crc-debug-thwr9\" (UID: \"d274714f-0735-4c14-94c9-8ac28834edaa\") " pod="openshift-must-gather-rcph6/crc-debug-thwr9" Mar 17 12:20:58 crc kubenswrapper[4742]: I0317 12:20:58.309854 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d274714f-0735-4c14-94c9-8ac28834edaa-host\") pod \"crc-debug-thwr9\" (UID: \"d274714f-0735-4c14-94c9-8ac28834edaa\") " pod="openshift-must-gather-rcph6/crc-debug-thwr9" Mar 17 12:20:58 crc kubenswrapper[4742]: I0317 12:20:58.310029 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d274714f-0735-4c14-94c9-8ac28834edaa-host\") pod \"crc-debug-thwr9\" (UID: \"d274714f-0735-4c14-94c9-8ac28834edaa\") " pod="openshift-must-gather-rcph6/crc-debug-thwr9" Mar 17 12:20:58 crc kubenswrapper[4742]: I0317 12:20:58.331223 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dldp6\" (UniqueName: \"kubernetes.io/projected/d274714f-0735-4c14-94c9-8ac28834edaa-kube-api-access-dldp6\") pod \"crc-debug-thwr9\" (UID: \"d274714f-0735-4c14-94c9-8ac28834edaa\") " pod="openshift-must-gather-rcph6/crc-debug-thwr9" Mar 17 12:20:58 crc kubenswrapper[4742]: I0317 12:20:58.406281 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xqrwr" Mar 17 12:20:58 crc kubenswrapper[4742]: I0317 12:20:58.458527 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rcph6/crc-debug-thwr9" Mar 17 12:20:58 crc kubenswrapper[4742]: I0317 12:20:58.466988 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xqrwr" Mar 17 12:20:58 crc kubenswrapper[4742]: I0317 12:20:58.677075 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7da57f47-676b-4be4-aa84-96b4c55e555b" path="/var/lib/kubelet/pods/7da57f47-676b-4be4-aa84-96b4c55e555b/volumes" Mar 17 12:20:58 crc kubenswrapper[4742]: I0317 12:20:58.992897 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xqrwr"] Mar 17 12:20:59 crc kubenswrapper[4742]: I0317 12:20:59.168760 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p52h7"] Mar 17 12:20:59 crc kubenswrapper[4742]: I0317 12:20:59.321246 4742 generic.go:334] "Generic (PLEG): container finished" podID="d274714f-0735-4c14-94c9-8ac28834edaa" containerID="706dcbd78243798a2cfedd66d81a5754800dfa98297238a642a8d63762a7f146" exitCode=0 Mar 17 12:20:59 crc kubenswrapper[4742]: I0317 12:20:59.321371 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rcph6/crc-debug-thwr9" event={"ID":"d274714f-0735-4c14-94c9-8ac28834edaa","Type":"ContainerDied","Data":"706dcbd78243798a2cfedd66d81a5754800dfa98297238a642a8d63762a7f146"} Mar 17 12:20:59 crc kubenswrapper[4742]: I0317 12:20:59.321756 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rcph6/crc-debug-thwr9" event={"ID":"d274714f-0735-4c14-94c9-8ac28834edaa","Type":"ContainerStarted","Data":"13a00d8a40d64a5b333c1d8c9093228d3f44877850d0fb0251bd5bcbcafb582e"} Mar 17 12:20:59 crc kubenswrapper[4742]: I0317 12:20:59.322034 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p52h7" podUID="a52e996e-9305-4a8a-bb51-9d2d72223dcf" containerName="registry-server" containerID="cri-o://4b75ee601018482e051623b5f21f20616250bd08c568777a7842b09df1c61585" gracePeriod=2 Mar 17 12:20:59 crc kubenswrapper[4742]: I0317 12:20:59.652233 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rcph6/crc-debug-thwr9"] Mar 17 12:20:59 crc kubenswrapper[4742]: I0317 12:20:59.659651 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rcph6/crc-debug-thwr9"] Mar 17 12:20:59 crc kubenswrapper[4742]: I0317 12:20:59.877477 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p52h7" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.038280 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a52e996e-9305-4a8a-bb51-9d2d72223dcf-catalog-content\") pod \"a52e996e-9305-4a8a-bb51-9d2d72223dcf\" (UID: \"a52e996e-9305-4a8a-bb51-9d2d72223dcf\") " Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.038482 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a52e996e-9305-4a8a-bb51-9d2d72223dcf-utilities\") pod \"a52e996e-9305-4a8a-bb51-9d2d72223dcf\" (UID: \"a52e996e-9305-4a8a-bb51-9d2d72223dcf\") " Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.038535 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fntfc\" (UniqueName: \"kubernetes.io/projected/a52e996e-9305-4a8a-bb51-9d2d72223dcf-kube-api-access-fntfc\") pod \"a52e996e-9305-4a8a-bb51-9d2d72223dcf\" (UID: \"a52e996e-9305-4a8a-bb51-9d2d72223dcf\") " Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.038981 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a52e996e-9305-4a8a-bb51-9d2d72223dcf-utilities" (OuterVolumeSpecName: "utilities") pod "a52e996e-9305-4a8a-bb51-9d2d72223dcf" (UID: "a52e996e-9305-4a8a-bb51-9d2d72223dcf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.044212 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52e996e-9305-4a8a-bb51-9d2d72223dcf-kube-api-access-fntfc" (OuterVolumeSpecName: "kube-api-access-fntfc") pod "a52e996e-9305-4a8a-bb51-9d2d72223dcf" (UID: "a52e996e-9305-4a8a-bb51-9d2d72223dcf"). InnerVolumeSpecName "kube-api-access-fntfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.140763 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a52e996e-9305-4a8a-bb51-9d2d72223dcf-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.141010 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fntfc\" (UniqueName: \"kubernetes.io/projected/a52e996e-9305-4a8a-bb51-9d2d72223dcf-kube-api-access-fntfc\") on node \"crc\" DevicePath \"\"" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.226045 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a52e996e-9305-4a8a-bb51-9d2d72223dcf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a52e996e-9305-4a8a-bb51-9d2d72223dcf" (UID: "a52e996e-9305-4a8a-bb51-9d2d72223dcf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.242294 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a52e996e-9305-4a8a-bb51-9d2d72223dcf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.333585 4742 generic.go:334] "Generic (PLEG): container finished" podID="a52e996e-9305-4a8a-bb51-9d2d72223dcf" containerID="4b75ee601018482e051623b5f21f20616250bd08c568777a7842b09df1c61585" exitCode=0 Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.334106 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p52h7" event={"ID":"a52e996e-9305-4a8a-bb51-9d2d72223dcf","Type":"ContainerDied","Data":"4b75ee601018482e051623b5f21f20616250bd08c568777a7842b09df1c61585"} Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.334216 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p52h7" event={"ID":"a52e996e-9305-4a8a-bb51-9d2d72223dcf","Type":"ContainerDied","Data":"dbc2270af5ba8a5ca055d16f890c0f3f61f21ee36e4972292896ff3eaaccef54"} Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.334224 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p52h7" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.334246 4742 scope.go:117] "RemoveContainer" containerID="4b75ee601018482e051623b5f21f20616250bd08c568777a7842b09df1c61585" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.407791 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rcph6/crc-debug-thwr9" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.431568 4742 scope.go:117] "RemoveContainer" containerID="c6aabaf8b6c4c8c65a34c496d74c40cf04158398bb78638a263ce9a1f19f5816" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.435244 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p52h7"] Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.455106 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p52h7"] Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.463164 4742 scope.go:117] "RemoveContainer" containerID="626ca01c5ea635077f198085f767b451cdc90b535d7489db51a4162d4a1329ff" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.487310 4742 scope.go:117] "RemoveContainer" containerID="4b75ee601018482e051623b5f21f20616250bd08c568777a7842b09df1c61585" Mar 17 12:21:00 crc kubenswrapper[4742]: E0317 12:21:00.488035 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b75ee601018482e051623b5f21f20616250bd08c568777a7842b09df1c61585\": container with ID starting with 4b75ee601018482e051623b5f21f20616250bd08c568777a7842b09df1c61585 not found: ID does not exist" containerID="4b75ee601018482e051623b5f21f20616250bd08c568777a7842b09df1c61585" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.488061 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b75ee601018482e051623b5f21f20616250bd08c568777a7842b09df1c61585"} err="failed to get container status \"4b75ee601018482e051623b5f21f20616250bd08c568777a7842b09df1c61585\": rpc error: code = NotFound desc = could not find container 
\"4b75ee601018482e051623b5f21f20616250bd08c568777a7842b09df1c61585\": container with ID starting with 4b75ee601018482e051623b5f21f20616250bd08c568777a7842b09df1c61585 not found: ID does not exist" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.488081 4742 scope.go:117] "RemoveContainer" containerID="c6aabaf8b6c4c8c65a34c496d74c40cf04158398bb78638a263ce9a1f19f5816" Mar 17 12:21:00 crc kubenswrapper[4742]: E0317 12:21:00.488300 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6aabaf8b6c4c8c65a34c496d74c40cf04158398bb78638a263ce9a1f19f5816\": container with ID starting with c6aabaf8b6c4c8c65a34c496d74c40cf04158398bb78638a263ce9a1f19f5816 not found: ID does not exist" containerID="c6aabaf8b6c4c8c65a34c496d74c40cf04158398bb78638a263ce9a1f19f5816" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.488325 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6aabaf8b6c4c8c65a34c496d74c40cf04158398bb78638a263ce9a1f19f5816"} err="failed to get container status \"c6aabaf8b6c4c8c65a34c496d74c40cf04158398bb78638a263ce9a1f19f5816\": rpc error: code = NotFound desc = could not find container \"c6aabaf8b6c4c8c65a34c496d74c40cf04158398bb78638a263ce9a1f19f5816\": container with ID starting with c6aabaf8b6c4c8c65a34c496d74c40cf04158398bb78638a263ce9a1f19f5816 not found: ID does not exist" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.488337 4742 scope.go:117] "RemoveContainer" containerID="626ca01c5ea635077f198085f767b451cdc90b535d7489db51a4162d4a1329ff" Mar 17 12:21:00 crc kubenswrapper[4742]: E0317 12:21:00.489673 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626ca01c5ea635077f198085f767b451cdc90b535d7489db51a4162d4a1329ff\": container with ID starting with 626ca01c5ea635077f198085f767b451cdc90b535d7489db51a4162d4a1329ff not found: ID does not exist" containerID="626ca01c5ea635077f198085f767b451cdc90b535d7489db51a4162d4a1329ff" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.489702 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626ca01c5ea635077f198085f767b451cdc90b535d7489db51a4162d4a1329ff"} err="failed to get container status \"626ca01c5ea635077f198085f767b451cdc90b535d7489db51a4162d4a1329ff\": rpc error: code = NotFound desc = could not find container \"626ca01c5ea635077f198085f767b451cdc90b535d7489db51a4162d4a1329ff\": container with ID starting with 626ca01c5ea635077f198085f767b451cdc90b535d7489db51a4162d4a1329ff not found: ID does not exist" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.550275 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d274714f-0735-4c14-94c9-8ac28834edaa-host" (OuterVolumeSpecName: "host") pod "d274714f-0735-4c14-94c9-8ac28834edaa" (UID: "d274714f-0735-4c14-94c9-8ac28834edaa"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.551066 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d274714f-0735-4c14-94c9-8ac28834edaa-host\") pod \"d274714f-0735-4c14-94c9-8ac28834edaa\" (UID: \"d274714f-0735-4c14-94c9-8ac28834edaa\") " Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.551287 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dldp6\" (UniqueName: \"kubernetes.io/projected/d274714f-0735-4c14-94c9-8ac28834edaa-kube-api-access-dldp6\") pod \"d274714f-0735-4c14-94c9-8ac28834edaa\" (UID: \"d274714f-0735-4c14-94c9-8ac28834edaa\") " Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.551741 4742 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d274714f-0735-4c14-94c9-8ac28834edaa-host\") on node \"crc\" DevicePath \"\"" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.555020 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d274714f-0735-4c14-94c9-8ac28834edaa-kube-api-access-dldp6" (OuterVolumeSpecName: "kube-api-access-dldp6") pod "d274714f-0735-4c14-94c9-8ac28834edaa" (UID: "d274714f-0735-4c14-94c9-8ac28834edaa"). InnerVolumeSpecName "kube-api-access-dldp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.653298 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dldp6\" (UniqueName: \"kubernetes.io/projected/d274714f-0735-4c14-94c9-8ac28834edaa-kube-api-access-dldp6\") on node \"crc\" DevicePath \"\"" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.672985 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a52e996e-9305-4a8a-bb51-9d2d72223dcf" path="/var/lib/kubelet/pods/a52e996e-9305-4a8a-bb51-9d2d72223dcf/volumes" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.674351 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d274714f-0735-4c14-94c9-8ac28834edaa" path="/var/lib/kubelet/pods/d274714f-0735-4c14-94c9-8ac28834edaa/volumes" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.904953 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rcph6/crc-debug-z59f4"] Mar 17 12:21:00 crc kubenswrapper[4742]: E0317 12:21:00.905317 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52e996e-9305-4a8a-bb51-9d2d72223dcf" containerName="extract-utilities" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.905332 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52e996e-9305-4a8a-bb51-9d2d72223dcf" containerName="extract-utilities" Mar 17 12:21:00 crc kubenswrapper[4742]: E0317 12:21:00.905352 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d274714f-0735-4c14-94c9-8ac28834edaa" containerName="container-00" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.905360 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="d274714f-0735-4c14-94c9-8ac28834edaa" containerName="container-00" Mar 17 12:21:00 crc kubenswrapper[4742]: E0317 12:21:00.905370 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52e996e-9305-4a8a-bb51-9d2d72223dcf" containerName="registry-server" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.905379 4742 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a52e996e-9305-4a8a-bb51-9d2d72223dcf" containerName="registry-server" Mar 17 12:21:00 crc kubenswrapper[4742]: E0317 12:21:00.905390 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52e996e-9305-4a8a-bb51-9d2d72223dcf" containerName="extract-content" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.905396 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52e996e-9305-4a8a-bb51-9d2d72223dcf" containerName="extract-content" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.905561 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="d274714f-0735-4c14-94c9-8ac28834edaa" containerName="container-00" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.905572 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52e996e-9305-4a8a-bb51-9d2d72223dcf" containerName="registry-server" Mar 17 12:21:00 crc kubenswrapper[4742]: I0317 12:21:00.906131 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rcph6/crc-debug-z59f4" Mar 17 12:21:01 crc kubenswrapper[4742]: I0317 12:21:01.060486 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb7f4d06-04cb-42d0-bcf6-f120e1d3536c-host\") pod \"crc-debug-z59f4\" (UID: \"fb7f4d06-04cb-42d0-bcf6-f120e1d3536c\") " pod="openshift-must-gather-rcph6/crc-debug-z59f4" Mar 17 12:21:01 crc kubenswrapper[4742]: I0317 12:21:01.060541 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xpsw\" (UniqueName: \"kubernetes.io/projected/fb7f4d06-04cb-42d0-bcf6-f120e1d3536c-kube-api-access-9xpsw\") pod \"crc-debug-z59f4\" (UID: \"fb7f4d06-04cb-42d0-bcf6-f120e1d3536c\") " pod="openshift-must-gather-rcph6/crc-debug-z59f4" Mar 17 12:21:01 crc kubenswrapper[4742]: I0317 12:21:01.162660 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb7f4d06-04cb-42d0-bcf6-f120e1d3536c-host\") pod \"crc-debug-z59f4\" (UID: \"fb7f4d06-04cb-42d0-bcf6-f120e1d3536c\") " pod="openshift-must-gather-rcph6/crc-debug-z59f4" Mar 17 12:21:01 crc kubenswrapper[4742]: I0317 12:21:01.162722 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xpsw\" (UniqueName: \"kubernetes.io/projected/fb7f4d06-04cb-42d0-bcf6-f120e1d3536c-kube-api-access-9xpsw\") pod \"crc-debug-z59f4\" (UID: \"fb7f4d06-04cb-42d0-bcf6-f120e1d3536c\") " pod="openshift-must-gather-rcph6/crc-debug-z59f4" Mar 17 12:21:01 crc kubenswrapper[4742]: I0317 12:21:01.162820 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb7f4d06-04cb-42d0-bcf6-f120e1d3536c-host\") pod \"crc-debug-z59f4\" (UID: \"fb7f4d06-04cb-42d0-bcf6-f120e1d3536c\") " pod="openshift-must-gather-rcph6/crc-debug-z59f4" Mar 17 12:21:01 crc kubenswrapper[4742]: I0317 12:21:01.183186 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xpsw\" (UniqueName: \"kubernetes.io/projected/fb7f4d06-04cb-42d0-bcf6-f120e1d3536c-kube-api-access-9xpsw\") pod \"crc-debug-z59f4\" (UID: \"fb7f4d06-04cb-42d0-bcf6-f120e1d3536c\") " pod="openshift-must-gather-rcph6/crc-debug-z59f4" Mar 17 12:21:01 crc kubenswrapper[4742]: I0317 12:21:01.277419 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rcph6/crc-debug-z59f4" Mar 17 12:21:01 crc kubenswrapper[4742]: I0317 12:21:01.343254 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rcph6/crc-debug-z59f4" event={"ID":"fb7f4d06-04cb-42d0-bcf6-f120e1d3536c","Type":"ContainerStarted","Data":"e78b0b97ab6940b58f90572646d583118f52a39206ec93754291e3df432d5fec"} Mar 17 12:21:01 crc kubenswrapper[4742]: I0317 12:21:01.345899 4742 scope.go:117] "RemoveContainer" containerID="706dcbd78243798a2cfedd66d81a5754800dfa98297238a642a8d63762a7f146" Mar 17 12:21:01 crc kubenswrapper[4742]: I0317 12:21:01.346016 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rcph6/crc-debug-thwr9" Mar 17 12:21:02 crc kubenswrapper[4742]: I0317 12:21:02.359892 4742 generic.go:334] "Generic (PLEG): container finished" podID="fb7f4d06-04cb-42d0-bcf6-f120e1d3536c" containerID="e4bde9daaa5c471adad2185d1c6913632198048ccc6d76dfc8a060884af67d07" exitCode=0 Mar 17 12:21:02 crc kubenswrapper[4742]: I0317 12:21:02.360052 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rcph6/crc-debug-z59f4" event={"ID":"fb7f4d06-04cb-42d0-bcf6-f120e1d3536c","Type":"ContainerDied","Data":"e4bde9daaa5c471adad2185d1c6913632198048ccc6d76dfc8a060884af67d07"} Mar 17 12:21:02 crc kubenswrapper[4742]: I0317 12:21:02.395291 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rcph6/crc-debug-z59f4"] Mar 17 12:21:02 crc kubenswrapper[4742]: I0317 12:21:02.408155 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rcph6/crc-debug-z59f4"] Mar 17 12:21:03 crc kubenswrapper[4742]: I0317 12:21:03.483805 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rcph6/crc-debug-z59f4" Mar 17 12:21:03 crc kubenswrapper[4742]: I0317 12:21:03.605003 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb7f4d06-04cb-42d0-bcf6-f120e1d3536c-host\") pod \"fb7f4d06-04cb-42d0-bcf6-f120e1d3536c\" (UID: \"fb7f4d06-04cb-42d0-bcf6-f120e1d3536c\") " Mar 17 12:21:03 crc kubenswrapper[4742]: I0317 12:21:03.605055 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xpsw\" (UniqueName: \"kubernetes.io/projected/fb7f4d06-04cb-42d0-bcf6-f120e1d3536c-kube-api-access-9xpsw\") pod \"fb7f4d06-04cb-42d0-bcf6-f120e1d3536c\" (UID: \"fb7f4d06-04cb-42d0-bcf6-f120e1d3536c\") " Mar 17 12:21:03 crc kubenswrapper[4742]: I0317 12:21:03.605558 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb7f4d06-04cb-42d0-bcf6-f120e1d3536c-host" (OuterVolumeSpecName: "host") pod "fb7f4d06-04cb-42d0-bcf6-f120e1d3536c" (UID: "fb7f4d06-04cb-42d0-bcf6-f120e1d3536c"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 12:21:03 crc kubenswrapper[4742]: I0317 12:21:03.605729 4742 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fb7f4d06-04cb-42d0-bcf6-f120e1d3536c-host\") on node \"crc\" DevicePath \"\"" Mar 17 12:21:03 crc kubenswrapper[4742]: I0317 12:21:03.620692 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb7f4d06-04cb-42d0-bcf6-f120e1d3536c-kube-api-access-9xpsw" (OuterVolumeSpecName: "kube-api-access-9xpsw") pod "fb7f4d06-04cb-42d0-bcf6-f120e1d3536c" (UID: "fb7f4d06-04cb-42d0-bcf6-f120e1d3536c"). InnerVolumeSpecName "kube-api-access-9xpsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:21:03 crc kubenswrapper[4742]: I0317 12:21:03.707334 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xpsw\" (UniqueName: \"kubernetes.io/projected/fb7f4d06-04cb-42d0-bcf6-f120e1d3536c-kube-api-access-9xpsw\") on node \"crc\" DevicePath \"\"" Mar 17 12:21:04 crc kubenswrapper[4742]: I0317 12:21:04.376313 4742 scope.go:117] "RemoveContainer" containerID="e4bde9daaa5c471adad2185d1c6913632198048ccc6d76dfc8a060884af67d07" Mar 17 12:21:04 crc kubenswrapper[4742]: I0317 12:21:04.376370 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rcph6/crc-debug-z59f4" Mar 17 12:21:04 crc kubenswrapper[4742]: I0317 12:21:04.673778 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb7f4d06-04cb-42d0-bcf6-f120e1d3536c" path="/var/lib/kubelet/pods/fb7f4d06-04cb-42d0-bcf6-f120e1d3536c/volumes" Mar 17 12:21:39 crc kubenswrapper[4742]: I0317 12:21:39.936513 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6f76787fd-cvxz9_f550d045-d552-4ea9-b5c8-a4e7d9ff29a1/barbican-api/0.log" Mar 17 12:21:40 crc kubenswrapper[4742]: I0317 12:21:40.106393 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6f76787fd-cvxz9_f550d045-d552-4ea9-b5c8-a4e7d9ff29a1/barbican-api-log/0.log" Mar 17 12:21:40 crc kubenswrapper[4742]: I0317 12:21:40.137506 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-bf65fb77d-664w7_8ac953fc-7316-4941-920f-8298fd752c3a/barbican-keystone-listener-log/0.log" Mar 17 12:21:40 crc kubenswrapper[4742]: I0317 12:21:40.168418 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-bf65fb77d-664w7_8ac953fc-7316-4941-920f-8298fd752c3a/barbican-keystone-listener/0.log" Mar 17 12:21:40 crc kubenswrapper[4742]: I0317 12:21:40.339532 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-68ddcd6d89-6jx5j_1b377427-ca51-4054-9725-545bba6b9319/barbican-worker-log/0.log" Mar 17 12:21:40 crc kubenswrapper[4742]: I0317 12:21:40.346558 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-68ddcd6d89-6jx5j_1b377427-ca51-4054-9725-545bba6b9319/barbican-worker/0.log" Mar 17 12:21:40 crc kubenswrapper[4742]: I0317 12:21:40.501764 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4q8xc_e6bf81f0-73d3-4dde-937d-87bbea94c36e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:21:40 crc kubenswrapper[4742]: I0317 12:21:40.552670 4742 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_ea32fef3-81ea-41cb-8641-3a43304683c6/ceilometer-central-agent/0.log" Mar 17 12:21:40 crc kubenswrapper[4742]: I0317 12:21:40.603637 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ea32fef3-81ea-41cb-8641-3a43304683c6/ceilometer-notification-agent/0.log" Mar 17 12:21:40 crc kubenswrapper[4742]: I0317 12:21:40.689981 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ea32fef3-81ea-41cb-8641-3a43304683c6/proxy-httpd/0.log" Mar 17 12:21:40 crc kubenswrapper[4742]: I0317 12:21:40.743573 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ea32fef3-81ea-41cb-8641-3a43304683c6/sg-core/0.log" Mar 17 12:21:40 crc kubenswrapper[4742]: I0317 12:21:40.856062 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e36e2fb7-b344-4c81-9922-3d9bc9526261/cinder-api/0.log" Mar 17 12:21:40 crc kubenswrapper[4742]: I0317 12:21:40.934621 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e36e2fb7-b344-4c81-9922-3d9bc9526261/cinder-api-log/0.log" Mar 17 12:21:41 crc kubenswrapper[4742]: I0317 12:21:41.056038 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_12f6380b-463f-4c5b-9c4a-809c874b2ca5/cinder-scheduler/0.log" Mar 17 12:21:41 crc kubenswrapper[4742]: I0317 12:21:41.072959 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_12f6380b-463f-4c5b-9c4a-809c874b2ca5/probe/0.log" Mar 17 12:21:41 crc kubenswrapper[4742]: I0317 12:21:41.185894 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-6nvrh_bd4b8d37-8f12-4560-b616-cbbed45a7cb2/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:21:41 crc kubenswrapper[4742]: I0317 12:21:41.299340 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-pcmpv_bfb05f67-f7aa-480f-a4e9-3f24ee2102d4/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:21:41 crc kubenswrapper[4742]: I0317 12:21:41.533611 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-z42hc_d3035223-2765-4ce8-ac14-f53ffcca7a1b/init/0.log" Mar 17 12:21:41 crc kubenswrapper[4742]: I0317 12:21:41.692504 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-z42hc_d3035223-2765-4ce8-ac14-f53ffcca7a1b/init/0.log" Mar 17 12:21:41 crc kubenswrapper[4742]: I0317 12:21:41.711605 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-z42hc_d3035223-2765-4ce8-ac14-f53ffcca7a1b/dnsmasq-dns/0.log" Mar 17 12:21:41 crc kubenswrapper[4742]: I0317 12:21:41.752146 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mvrds_a8691841-aa32-407b-bbdc-97c5551ec591/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:21:41 crc kubenswrapper[4742]: I0317 12:21:41.914535 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fdc48ac3-7501-4e63-9290-bff06909b045/glance-httpd/0.log" Mar 17 12:21:41 crc kubenswrapper[4742]: I0317 12:21:41.932204 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_fdc48ac3-7501-4e63-9290-bff06909b045/glance-log/0.log" Mar 17 
12:21:42 crc kubenswrapper[4742]: I0317 12:21:42.088782 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c030ab26-9079-49cf-837f-c0625cfe6cc3/glance-httpd/0.log" Mar 17 12:21:42 crc kubenswrapper[4742]: I0317 12:21:42.104312 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c030ab26-9079-49cf-837f-c0625cfe6cc3/glance-log/0.log" Mar 17 12:21:42 crc kubenswrapper[4742]: I0317 12:21:42.250440 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5c4556b444-kq454_480fea20-eab5-4c68-9bc3-9b218ba0b43d/horizon/0.log" Mar 17 12:21:42 crc kubenswrapper[4742]: I0317 12:21:42.491240 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-6g95k_62491de6-4c04-49d7-82f2-124f6cceff11/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:21:42 crc kubenswrapper[4742]: I0317 12:21:42.673692 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-lk8g5_71aa9411-3abc-46dd-9907-3f2847f83866/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:21:42 crc kubenswrapper[4742]: I0317 12:21:42.728355 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5c4556b444-kq454_480fea20-eab5-4c68-9bc3-9b218ba0b43d/horizon-log/0.log" Mar 17 12:21:42 crc kubenswrapper[4742]: I0317 12:21:42.908162 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29562481-pwxss_7cfb9cd7-2718-4547-a238-e62cfa4f3cb5/keystone-cron/0.log" Mar 17 12:21:42 crc kubenswrapper[4742]: I0317 12:21:42.955962 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cf69c6b9b-d9hmq_896b4ef2-200c-4981-b22f-d93e9979c130/keystone-api/0.log" Mar 17 12:21:43 crc kubenswrapper[4742]: I0317 12:21:43.085730 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_47db9f5f-1a39-4137-bc97-bf3192c64ced/kube-state-metrics/0.log" Mar 17 12:21:43 crc kubenswrapper[4742]: I0317 12:21:43.161460 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vwbv9_7fd024b3-844f-4118-92b5-81dcc6da9fd6/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:21:43 crc kubenswrapper[4742]: I0317 12:21:43.414338 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-c7d48c699-86xxh_1ccfa960-12b9-4537-b822-89da493f780c/neutron-httpd/0.log" Mar 17 12:21:43 crc kubenswrapper[4742]: I0317 12:21:43.498484 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-c7d48c699-86xxh_1ccfa960-12b9-4537-b822-89da493f780c/neutron-api/0.log" Mar 17 12:21:43 crc kubenswrapper[4742]: I0317 12:21:43.636143 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-462xt_764bf75a-9487-4005-b6ee-ca369e722c4a/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:21:44 crc kubenswrapper[4742]: I0317 12:21:44.097140 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f2b889c9-de23-4357-956c-1684e42c64de/nova-api-log/0.log" Mar 17 12:21:44 crc kubenswrapper[4742]: I0317 12:21:44.205295 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_32a50429-d785-408b-b53f-fef4700692c6/nova-cell0-conductor-conductor/0.log" Mar 17 
12:21:44 crc kubenswrapper[4742]: I0317 12:21:44.442929 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_47ecd8fa-016c-43b5-9d9f-42c776c8e38d/nova-cell1-conductor-conductor/0.log" Mar 17 12:21:44 crc kubenswrapper[4742]: I0317 12:21:44.608524 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_00a7a363-ec82-40a4-8121-fd6839727132/nova-cell1-novncproxy-novncproxy/0.log" Mar 17 12:21:44 crc kubenswrapper[4742]: I0317 12:21:44.617347 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f2b889c9-de23-4357-956c-1684e42c64de/nova-api-api/0.log" Mar 17 12:21:45 crc kubenswrapper[4742]: I0317 12:21:45.101489 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-76jn7_6468192a-58e3-4b66-9551-1d67dc93f0ae/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:21:45 crc kubenswrapper[4742]: I0317 12:21:45.156002 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8f6a1398-04d6-4668-9689-17bdbb214850/nova-metadata-log/0.log" Mar 17 12:21:45 crc kubenswrapper[4742]: I0317 12:21:45.451921 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_5c8591a2-6548-4bcb-8be3-71e549605bd2/nova-scheduler-scheduler/0.log" Mar 17 12:21:45 crc kubenswrapper[4742]: I0317 12:21:45.553061 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5/mysql-bootstrap/0.log" Mar 17 12:21:45 crc kubenswrapper[4742]: I0317 12:21:45.691991 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5/mysql-bootstrap/0.log" Mar 17 12:21:45 crc kubenswrapper[4742]: I0317 12:21:45.707822 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ae6eb94b-ef26-4f1a-b3b8-a0300262d4e5/galera/0.log" Mar 17 12:21:45 crc kubenswrapper[4742]: I0317 12:21:45.728868 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8f6a1398-04d6-4668-9689-17bdbb214850/nova-metadata-metadata/0.log" Mar 17 12:21:45 crc kubenswrapper[4742]: I0317 12:21:45.904130 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_91d27a2f-a471-4f90-aabb-9a021036805e/mysql-bootstrap/0.log" Mar 17 12:21:46 crc kubenswrapper[4742]: I0317 12:21:46.951318 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_91d27a2f-a471-4f90-aabb-9a021036805e/galera/0.log" Mar 17 12:21:46 crc kubenswrapper[4742]: I0317 12:21:46.962807 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_91d27a2f-a471-4f90-aabb-9a021036805e/mysql-bootstrap/0.log" Mar 17 12:21:46 crc kubenswrapper[4742]: I0317 12:21:46.988749 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_11e12da8-9e80-453f-bbbd-03d1346afe5b/openstackclient/0.log" Mar 17 12:21:47 crc kubenswrapper[4742]: I0317 12:21:47.137534 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-4j5jz_158a0d7f-e22f-4f44-aca2-efb59ff90439/ovn-controller/0.log" Mar 17 12:21:47 crc kubenswrapper[4742]: I0317 12:21:47.177347 4742 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-pmxjd_0a50ef5e-ba73-4d00-baba-b8ef6c621d71/openstack-network-exporter/0.log" Mar 17 12:21:47 crc kubenswrapper[4742]: I0317 12:21:47.328005 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dmqzv_dd5cf259-c4bf-44cf-b101-bcc78c153852/ovsdb-server-init/0.log" Mar 17 12:21:47 crc kubenswrapper[4742]: I0317 12:21:47.597924 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dmqzv_dd5cf259-c4bf-44cf-b101-bcc78c153852/ovsdb-server-init/0.log" Mar 17 12:21:47 crc kubenswrapper[4742]: I0317 12:21:47.610058 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dmqzv_dd5cf259-c4bf-44cf-b101-bcc78c153852/ovsdb-server/0.log" Mar 17 12:21:47 crc kubenswrapper[4742]: I0317 12:21:47.625819 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-dmqzv_dd5cf259-c4bf-44cf-b101-bcc78c153852/ovs-vswitchd/0.log" Mar 17 12:21:47 crc kubenswrapper[4742]: I0317 12:21:47.803200 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_56194d57-077f-40f4-87f6-386942ac0f6b/openstack-network-exporter/0.log" Mar 17 12:21:47 crc kubenswrapper[4742]: I0317 12:21:47.863503 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-zgs6w_9e7470ef-476f-4d0e-b7ec-349fbc6eff76/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:21:47 crc kubenswrapper[4742]: I0317 12:21:47.970486 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_56194d57-077f-40f4-87f6-386942ac0f6b/ovn-northd/0.log" Mar 17 12:21:48 crc kubenswrapper[4742]: I0317 12:21:48.079541 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7a4f3d5f-526a-4163-8dbb-a019050a0e03/openstack-network-exporter/0.log" Mar 17 12:21:48 crc kubenswrapper[4742]: I0317 12:21:48.169026 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7a4f3d5f-526a-4163-8dbb-a019050a0e03/ovsdbserver-nb/0.log" Mar 17 12:21:48 crc kubenswrapper[4742]: I0317 12:21:48.278093 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3bba6aef-f8ff-436a-b3c1-97fbe9819ff1/openstack-network-exporter/0.log" Mar 17 12:21:48 crc kubenswrapper[4742]: I0317 12:21:48.290457 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3bba6aef-f8ff-436a-b3c1-97fbe9819ff1/ovsdbserver-sb/0.log" Mar 17 12:21:48 crc kubenswrapper[4742]: I0317 12:21:48.551771 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6976ff4586-bgqjp_221187ef-dec0-47dd-894e-ff9f2d1daa09/placement-api/0.log" Mar 17 12:21:48 crc kubenswrapper[4742]: I0317 12:21:48.569047 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6976ff4586-bgqjp_221187ef-dec0-47dd-894e-ff9f2d1daa09/placement-log/0.log" Mar 17 12:21:48 crc kubenswrapper[4742]: I0317 12:21:48.643087 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6c10e471-26c3-41ec-bf47-a5edf33c173d/setup-container/0.log" Mar 17 12:21:48 crc kubenswrapper[4742]: I0317 12:21:48.867251 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6c10e471-26c3-41ec-bf47-a5edf33c173d/setup-container/0.log" Mar 17 12:21:48 crc kubenswrapper[4742]: I0317 12:21:48.868281 4742 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6c10e471-26c3-41ec-bf47-a5edf33c173d/rabbitmq/0.log" Mar 17 12:21:48 crc kubenswrapper[4742]: I0317 12:21:48.881634 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4e8c9887-8315-444e-b3dd-9753e83f83fa/setup-container/0.log" Mar 17 12:21:49 crc kubenswrapper[4742]: I0317 12:21:49.042851 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4e8c9887-8315-444e-b3dd-9753e83f83fa/setup-container/0.log" Mar 17 12:21:49 crc kubenswrapper[4742]: I0317 12:21:49.125252 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4e8c9887-8315-444e-b3dd-9753e83f83fa/rabbitmq/0.log" Mar 17 12:21:49 crc kubenswrapper[4742]: I0317 12:21:49.227077 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-24h9p_aa52e3ae-e09a-4561-990a-59358b9b17b6/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:21:49 crc kubenswrapper[4742]: I0317 12:21:49.429329 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-m9bs4_abeb089b-7a3b-4ab5-b412-f6d7b7fd0c7f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:21:49 crc kubenswrapper[4742]: I0317 12:21:49.448549 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-l8dw6_529b4c5a-8be2-4820-b06a-11eb75c3dc3b/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:21:49 crc kubenswrapper[4742]: I0317 12:21:49.619148 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-7mg2n_2c1f61c9-540b-4044-ba34-2bb110401fa0/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:21:49 crc kubenswrapper[4742]: I0317 12:21:49.665849 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-p4fvs_ba90bc1a-0e57-455d-8594-4e11b1548097/ssh-known-hosts-edpm-deployment/0.log" Mar 17 12:21:49 crc kubenswrapper[4742]: I0317 12:21:49.868361 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-c96b95bb7-ckpvc_b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe/proxy-server/0.log" Mar 17 12:21:49 crc kubenswrapper[4742]: I0317 12:21:49.953814 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-c96b95bb7-ckpvc_b53f3e20-97c9-4ea5-b2c4-9ce6e370acbe/proxy-httpd/0.log" Mar 17 12:21:49 crc kubenswrapper[4742]: I0317 12:21:49.993595 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-rrnw9_3cc5195f-ecc0-4f8e-bc53-ea602fff501d/swift-ring-rebalance/0.log" Mar 17 12:21:50 crc kubenswrapper[4742]: I0317 12:21:50.099758 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/account-auditor/0.log" Mar 17 12:21:50 crc kubenswrapper[4742]: I0317 12:21:50.179263 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/account-reaper/0.log" Mar 17 12:21:50 crc kubenswrapper[4742]: I0317 12:21:50.223248 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/account-replicator/0.log" Mar 17 12:21:50 crc kubenswrapper[4742]: I0317 12:21:50.297334 4742 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/container-auditor/0.log" Mar 17 12:21:50 crc kubenswrapper[4742]: I0317 12:21:50.349770 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/account-server/0.log" Mar 17 12:21:50 crc kubenswrapper[4742]: I0317 12:21:50.429532 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/container-server/0.log" Mar 17 12:21:50 crc kubenswrapper[4742]: I0317 12:21:50.434498 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/container-replicator/0.log" Mar 17 12:21:50 crc kubenswrapper[4742]: I0317 12:21:50.553241 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/container-updater/0.log" Mar 17 12:21:50 crc kubenswrapper[4742]: I0317 12:21:50.563445 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/object-auditor/0.log" Mar 17 12:21:50 crc kubenswrapper[4742]: I0317 12:21:50.630950 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/object-expirer/0.log" Mar 17 12:21:50 crc kubenswrapper[4742]: I0317 12:21:50.642670 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/object-replicator/0.log" Mar 17 12:21:50 crc kubenswrapper[4742]: I0317 12:21:50.783813 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/object-server/0.log" Mar 17 12:21:50 crc kubenswrapper[4742]: I0317 12:21:50.810563 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/object-updater/0.log" Mar 17 12:21:50 crc kubenswrapper[4742]: I0317 12:21:50.845111 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/swift-recon-cron/0.log" Mar 17 12:21:50 crc kubenswrapper[4742]: I0317 12:21:50.853472 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_be22c821-2e25-47ed-938d-c748fc55a4c6/rsync/0.log" Mar 17 12:21:51 crc kubenswrapper[4742]: I0317 12:21:51.083011 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_cbe323de-3d55-4905-8f28-29cea959ae35/tempest-tests-tempest-tests-runner/0.log" Mar 17 12:21:51 crc kubenswrapper[4742]: I0317 12:21:51.114556 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-5bdxw_24003f05-4f7d-443d-8a19-8162dae339a2/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:21:51 crc kubenswrapper[4742]: I0317 12:21:51.280057 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6bfbc7cf-c913-4297-a60e-307a3829b636/test-operator-logs-container/0.log" Mar 17 12:21:51 crc kubenswrapper[4742]: I0317 12:21:51.447363 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-th85k_fe59da59-475f-4c7d-ab34-f3085125c224/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 17 12:22:00 crc 
kubenswrapper[4742]: I0317 12:22:00.145989 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562502-mqx27"] Mar 17 12:22:00 crc kubenswrapper[4742]: E0317 12:22:00.147080 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb7f4d06-04cb-42d0-bcf6-f120e1d3536c" containerName="container-00" Mar 17 12:22:00 crc kubenswrapper[4742]: I0317 12:22:00.147099 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb7f4d06-04cb-42d0-bcf6-f120e1d3536c" containerName="container-00" Mar 17 12:22:00 crc kubenswrapper[4742]: I0317 12:22:00.147395 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb7f4d06-04cb-42d0-bcf6-f120e1d3536c" containerName="container-00" Mar 17 12:22:00 crc kubenswrapper[4742]: I0317 12:22:00.148205 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562502-mqx27" Mar 17 12:22:00 crc kubenswrapper[4742]: I0317 12:22:00.153792 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 12:22:00 crc kubenswrapper[4742]: I0317 12:22:00.153827 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 12:22:00 crc kubenswrapper[4742]: I0317 12:22:00.153949 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 12:22:00 crc kubenswrapper[4742]: I0317 12:22:00.155709 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562502-mqx27"] Mar 17 12:22:00 crc kubenswrapper[4742]: I0317 12:22:00.270975 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m96d\" (UniqueName: \"kubernetes.io/projected/c09ddacc-c7ba-4eae-afd2-dc4ad528c497-kube-api-access-5m96d\") pod \"auto-csr-approver-29562502-mqx27\" (UID: \"c09ddacc-c7ba-4eae-afd2-dc4ad528c497\") " pod="openshift-infra/auto-csr-approver-29562502-mqx27" Mar 17 12:22:00 crc kubenswrapper[4742]: I0317 12:22:00.372108 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m96d\" (UniqueName: \"kubernetes.io/projected/c09ddacc-c7ba-4eae-afd2-dc4ad528c497-kube-api-access-5m96d\") pod \"auto-csr-approver-29562502-mqx27\" (UID: \"c09ddacc-c7ba-4eae-afd2-dc4ad528c497\") " pod="openshift-infra/auto-csr-approver-29562502-mqx27" Mar 17 12:22:00 crc kubenswrapper[4742]: I0317 12:22:00.394821 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m96d\" (UniqueName: \"kubernetes.io/projected/c09ddacc-c7ba-4eae-afd2-dc4ad528c497-kube-api-access-5m96d\") pod \"auto-csr-approver-29562502-mqx27\" (UID: \"c09ddacc-c7ba-4eae-afd2-dc4ad528c497\") " pod="openshift-infra/auto-csr-approver-29562502-mqx27" Mar 17 12:22:00 crc kubenswrapper[4742]: I0317 12:22:00.476660 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562502-mqx27" Mar 17 12:22:00 crc kubenswrapper[4742]: I0317 12:22:00.909025 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562502-mqx27"] Mar 17 12:22:01 crc kubenswrapper[4742]: I0317 12:22:01.939377 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562502-mqx27" event={"ID":"c09ddacc-c7ba-4eae-afd2-dc4ad528c497","Type":"ContainerStarted","Data":"5815a90a650b5895638a724001f8a359f91e055146b0fc4d5c2b77d4a564a390"} Mar 17 12:22:02 crc kubenswrapper[4742]: I0317 12:22:02.635851 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_5cbf7636-aea9-4186-be9f-a4b25776158c/memcached/0.log" Mar 17 12:22:02 crc kubenswrapper[4742]: I0317 12:22:02.948274 4742 generic.go:334] "Generic (PLEG): container finished" podID="c09ddacc-c7ba-4eae-afd2-dc4ad528c497" containerID="b73bf47f612cd79dab2317473f033cfcc59a52b6a0baf462b57bcd87d5c4dd23" exitCode=0 Mar 17 12:22:02 crc kubenswrapper[4742]: I0317 12:22:02.948509 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562502-mqx27" event={"ID":"c09ddacc-c7ba-4eae-afd2-dc4ad528c497","Type":"ContainerDied","Data":"b73bf47f612cd79dab2317473f033cfcc59a52b6a0baf462b57bcd87d5c4dd23"} Mar 17 12:22:04 crc kubenswrapper[4742]: I0317 12:22:04.324887 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562502-mqx27" Mar 17 12:22:04 crc kubenswrapper[4742]: I0317 12:22:04.459137 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m96d\" (UniqueName: \"kubernetes.io/projected/c09ddacc-c7ba-4eae-afd2-dc4ad528c497-kube-api-access-5m96d\") pod \"c09ddacc-c7ba-4eae-afd2-dc4ad528c497\" (UID: \"c09ddacc-c7ba-4eae-afd2-dc4ad528c497\") " Mar 17 12:22:04 crc kubenswrapper[4742]: I0317 12:22:04.477636 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c09ddacc-c7ba-4eae-afd2-dc4ad528c497-kube-api-access-5m96d" (OuterVolumeSpecName: "kube-api-access-5m96d") pod "c09ddacc-c7ba-4eae-afd2-dc4ad528c497" (UID: "c09ddacc-c7ba-4eae-afd2-dc4ad528c497"). InnerVolumeSpecName "kube-api-access-5m96d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:22:04 crc kubenswrapper[4742]: I0317 12:22:04.561722 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m96d\" (UniqueName: \"kubernetes.io/projected/c09ddacc-c7ba-4eae-afd2-dc4ad528c497-kube-api-access-5m96d\") on node \"crc\" DevicePath \"\"" Mar 17 12:22:04 crc kubenswrapper[4742]: I0317 12:22:04.975932 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562502-mqx27" event={"ID":"c09ddacc-c7ba-4eae-afd2-dc4ad528c497","Type":"ContainerDied","Data":"5815a90a650b5895638a724001f8a359f91e055146b0fc4d5c2b77d4a564a390"} Mar 17 12:22:04 crc kubenswrapper[4742]: I0317 12:22:04.976216 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5815a90a650b5895638a724001f8a359f91e055146b0fc4d5c2b77d4a564a390" Mar 17 12:22:04 crc kubenswrapper[4742]: I0317 12:22:04.975983 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562502-mqx27" Mar 17 12:22:05 crc kubenswrapper[4742]: I0317 12:22:05.413056 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562496-2k8gz"] Mar 17 12:22:05 crc kubenswrapper[4742]: I0317 12:22:05.424166 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562496-2k8gz"] Mar 17 12:22:06 crc kubenswrapper[4742]: I0317 12:22:06.674616 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58" path="/var/lib/kubelet/pods/d11ed4b7-8ca5-4d19-b9d9-a3d8c339ce58/volumes" Mar 17 12:22:19 crc kubenswrapper[4742]: I0317 12:22:19.563810 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-g729j_45257cde-ca39-4e50-b465-b76ea15e179e/manager/0.log" Mar 17 12:22:19 crc kubenswrapper[4742]: I0317 12:22:19.746232 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q_e6b4bfa7-c424-4a08-8a06-f73809217eff/util/0.log" Mar 17 12:22:20 crc kubenswrapper[4742]: I0317 12:22:20.016758 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q_e6b4bfa7-c424-4a08-8a06-f73809217eff/pull/0.log" Mar 17 12:22:20 crc kubenswrapper[4742]: I0317 12:22:20.025394 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q_e6b4bfa7-c424-4a08-8a06-f73809217eff/util/0.log" Mar 17 12:22:20 crc kubenswrapper[4742]: I0317 12:22:20.122946 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q_e6b4bfa7-c424-4a08-8a06-f73809217eff/pull/0.log" Mar 17 12:22:20 crc kubenswrapper[4742]: I0317 12:22:20.223785 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q_e6b4bfa7-c424-4a08-8a06-f73809217eff/util/0.log" Mar 17 12:22:20 crc kubenswrapper[4742]: I0317 12:22:20.284725 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q_e6b4bfa7-c424-4a08-8a06-f73809217eff/extract/0.log" Mar 17 12:22:20 crc kubenswrapper[4742]: I0317 12:22:20.285395 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dd155995c195bec3e3628bdb7ebb30a28da5be9a7c077541e67123b9737586q_e6b4bfa7-c424-4a08-8a06-f73809217eff/pull/0.log" Mar 17 12:22:20 crc kubenswrapper[4742]: I0317 12:22:20.490107 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-j5sfj_01ae7820-ca74-4237-ac4a-82b3605f2306/manager/0.log" Mar 17 12:22:20 crc kubenswrapper[4742]: I0317 12:22:20.721230 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-sq2xc_9b21605a-2c83-49df-ae0f-dfb172a1b9f5/manager/0.log" Mar 17 12:22:20 crc kubenswrapper[4742]: I0317 12:22:20.729431 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-6z2xv_a7d611a7-9728-4738-8efa-80883aa13b2b/manager/0.log" Mar 17 12:22:20 crc kubenswrapper[4742]: I0317 
12:22:20.918554 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-znwjl_c8ccb584-e9e1-4eba-827e-3e7197f3133f/manager/0.log" Mar 17 12:22:21 crc kubenswrapper[4742]: I0317 12:22:21.141813 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-4phr7_27150936-220d-4247-b873-10add7124430/manager/0.log" Mar 17 12:22:21 crc kubenswrapper[4742]: I0317 12:22:21.226155 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-4mj6d_1cdb0787-4a2a-41f6-aed0-8693b2669444/manager/0.log" Mar 17 12:22:21 crc kubenswrapper[4742]: I0317 12:22:21.394959 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-njktv_c1f29dbe-e3d8-4dc0-aafe-fcd1de367544/manager/0.log" Mar 17 12:22:21 crc kubenswrapper[4742]: I0317 12:22:21.472986 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-dvbmd_b3928371-ca20-41d9-8200-36410c2df752/manager/0.log" Mar 17 12:22:21 crc kubenswrapper[4742]: I0317 12:22:21.526358 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-xjp4g_f91fb07a-de67-44ff-b6af-446891941a60/manager/0.log" Mar 17 12:22:21 crc kubenswrapper[4742]: I0317 12:22:21.659680 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-7ttcf_46b5befe-2274-4bc8-a2c4-ce8a9fc915ae/manager/0.log" Mar 17 12:22:21 crc kubenswrapper[4742]: I0317 12:22:21.743056 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-g252s_0436441e-c132-4c65-aee5-8b20461c12e1/manager/0.log" Mar 17 12:22:21 crc kubenswrapper[4742]: I0317 12:22:21.902534 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-vshmg_88b49b71-3d6b-4ca0-8943-c0d0c10b9ff9/manager/0.log" Mar 17 12:22:21 crc kubenswrapper[4742]: I0317 12:22:21.989782 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-4fvjv_c59e15b4-2341-4b9e-8887-d6b1f594dc0e/manager/0.log" Mar 17 12:22:22 crc kubenswrapper[4742]: I0317 12:22:22.079434 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-89w9s_7a86c487-f0e3-40bf-a1fe-e70e97a0d8c0/manager/0.log" Mar 17 12:22:22 crc kubenswrapper[4742]: I0317 12:22:22.258777 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-58b7c959b5-zkf6c_30159976-f1ef-435e-b6e6-995553b51f65/operator/0.log" Mar 17 12:22:22 crc kubenswrapper[4742]: I0317 12:22:22.478608 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-d2ktx_37e024f1-44f6-48c9-ba86-323127371c28/registry-server/0.log" Mar 17 12:22:22 crc kubenswrapper[4742]: I0317 12:22:22.676457 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-zvlv9_9470c17e-90c4-4723-b3ef-af8ec6f1edc2/manager/0.log" Mar 17 12:22:22 crc kubenswrapper[4742]: I0317 12:22:22.774982 4742 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-c9j2m_5e3c7784-527e-4f97-b035-240b7014241f/manager/0.log" Mar 17 12:22:22 crc kubenswrapper[4742]: I0317 12:22:22.962516 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-44jhz_48d26de5-4809-4a61-82c3-03cbf56c57b0/operator/0.log" Mar 17 12:22:23 crc kubenswrapper[4742]: I0317 12:22:23.162157 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-fh8v8_53837a21-9249-4ff8-aa95-bdfbb6d49f33/manager/0.log" Mar 17 12:22:23 crc kubenswrapper[4742]: I0317 12:22:23.339223 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-rzpkl_f42b3e9f-55a9-47fe-a5b8-51b36d622657/manager/0.log" Mar 17 12:22:23 crc kubenswrapper[4742]: I0317 12:22:23.527598 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-c748c4754-6hffs_7d6829e2-3788-4653-91e4-bff007a7bb5d/manager/0.log" Mar 17 12:22:23 crc kubenswrapper[4742]: I0317 12:22:23.652822 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-rf6p4_0eaedaeb-8d0d-4fde-8b74-cdd689d56123/manager/0.log" Mar 17 12:22:23 crc kubenswrapper[4742]: I0317 12:22:23.686805 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-fhqr4_b6a6e1ca-6c30-4a35-bd0c-b700160fe8ee/manager/0.log" Mar 17 12:22:45 crc kubenswrapper[4742]: I0317 12:22:45.200262 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-t2nj8_957049a3-8921-4ec9-a66c-d0fe15848fad/control-plane-machine-set-operator/0.log" Mar 17 12:22:45 crc kubenswrapper[4742]: I0317 12:22:45.308059 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bc2zs_76ed03a0-90ee-4e37-9580-d7136a7fdc5e/kube-rbac-proxy/0.log" Mar 17 12:22:45 crc kubenswrapper[4742]: I0317 12:22:45.382303 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bc2zs_76ed03a0-90ee-4e37-9580-d7136a7fdc5e/machine-api-operator/0.log" Mar 17 12:22:58 crc kubenswrapper[4742]: I0317 12:22:58.234137 4742 scope.go:117] "RemoveContainer" containerID="a55fafc0d043385648298663871f05f345dfcffd91cbd9080a7503bc6db8ce64" Mar 17 12:22:58 crc kubenswrapper[4742]: I0317 12:22:58.857648 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-ncl69_fb8bea11-37f9-43cf-9a3c-07e54ebca5fa/cert-manager-controller/0.log" Mar 17 12:22:59 crc kubenswrapper[4742]: I0317 12:22:59.056281 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-k4cwb_a8125ed7-e435-4a7e-8b09-541af1b40820/cert-manager-cainjector/0.log" Mar 17 12:22:59 crc kubenswrapper[4742]: I0317 12:22:59.128316 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-vf65m_09203846-9e2d-4748-b11f-c64b5a9c9c85/cert-manager-webhook/0.log" Mar 17 12:23:12 crc kubenswrapper[4742]: I0317 12:23:12.296998 4742 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-7wxg4_a9d77ceb-2194-4bf6-809d-30ebc45c4dba/nmstate-console-plugin/0.log" Mar 17 12:23:12 crc kubenswrapper[4742]: I0317 12:23:12.514495 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-ttrvv_36b76368-76e0-42c0-944f-c799a074ff7f/kube-rbac-proxy/0.log" Mar 17 12:23:12 crc kubenswrapper[4742]: I0317 12:23:12.539446 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-b78m6_15a73401-5a6e-4a32-99ba-4efe8182c160/nmstate-handler/0.log" Mar 17 12:23:12 crc kubenswrapper[4742]: I0317 12:23:12.637956 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-ttrvv_36b76368-76e0-42c0-944f-c799a074ff7f/nmstate-metrics/0.log" Mar 17 12:23:12 crc kubenswrapper[4742]: I0317 12:23:12.720392 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-7gfv5_8ec78658-d1d9-4fa9-953c-153e38522338/nmstate-operator/0.log" Mar 17 12:23:12 crc kubenswrapper[4742]: I0317 12:23:12.815033 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-cn8bb_b2286c3d-e7d9-4ab5-827b-e6f7b9453a5b/nmstate-webhook/0.log" Mar 17 12:23:15 crc kubenswrapper[4742]: I0317 12:23:15.345327 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rprq7"] Mar 17 12:23:15 crc kubenswrapper[4742]: E0317 12:23:15.346871 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09ddacc-c7ba-4eae-afd2-dc4ad528c497" containerName="oc" Mar 17 12:23:15 crc kubenswrapper[4742]: I0317 12:23:15.346983 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09ddacc-c7ba-4eae-afd2-dc4ad528c497" containerName="oc" Mar 17 12:23:15 crc kubenswrapper[4742]: I0317 12:23:15.347217 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09ddacc-c7ba-4eae-afd2-dc4ad528c497" containerName="oc" Mar 17 12:23:15 crc kubenswrapper[4742]: I0317 12:23:15.348594 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rprq7" Mar 17 12:23:15 crc kubenswrapper[4742]: I0317 12:23:15.371280 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rprq7"] Mar 17 12:23:15 crc kubenswrapper[4742]: I0317 12:23:15.482038 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkfkr\" (UniqueName: \"kubernetes.io/projected/477d657d-27a4-4597-990c-4f4e297424a0-kube-api-access-zkfkr\") pod \"certified-operators-rprq7\" (UID: \"477d657d-27a4-4597-990c-4f4e297424a0\") " pod="openshift-marketplace/certified-operators-rprq7" Mar 17 12:23:15 crc kubenswrapper[4742]: I0317 12:23:15.482116 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477d657d-27a4-4597-990c-4f4e297424a0-utilities\") pod \"certified-operators-rprq7\" (UID: \"477d657d-27a4-4597-990c-4f4e297424a0\") " pod="openshift-marketplace/certified-operators-rprq7" Mar 17 12:23:15 crc kubenswrapper[4742]: I0317 12:23:15.482242 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477d657d-27a4-4597-990c-4f4e297424a0-catalog-content\") pod \"certified-operators-rprq7\" (UID: \"477d657d-27a4-4597-990c-4f4e297424a0\") " pod="openshift-marketplace/certified-operators-rprq7" Mar 17 12:23:15 crc kubenswrapper[4742]: I0317 12:23:15.583938 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477d657d-27a4-4597-990c-4f4e297424a0-utilities\") pod \"certified-operators-rprq7\" (UID: \"477d657d-27a4-4597-990c-4f4e297424a0\") " pod="openshift-marketplace/certified-operators-rprq7" Mar 17 12:23:15 crc kubenswrapper[4742]: I0317 12:23:15.584137 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477d657d-27a4-4597-990c-4f4e297424a0-catalog-content\") pod \"certified-operators-rprq7\" (UID: \"477d657d-27a4-4597-990c-4f4e297424a0\") " pod="openshift-marketplace/certified-operators-rprq7" Mar 17 12:23:15 crc kubenswrapper[4742]: I0317 12:23:15.584289 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkfkr\" (UniqueName: \"kubernetes.io/projected/477d657d-27a4-4597-990c-4f4e297424a0-kube-api-access-zkfkr\") pod \"certified-operators-rprq7\" (UID: \"477d657d-27a4-4597-990c-4f4e297424a0\") " pod="openshift-marketplace/certified-operators-rprq7" Mar 17 12:23:15 crc kubenswrapper[4742]: I0317 12:23:15.584379 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477d657d-27a4-4597-990c-4f4e297424a0-utilities\") pod \"certified-operators-rprq7\" (UID: \"477d657d-27a4-4597-990c-4f4e297424a0\") " pod="openshift-marketplace/certified-operators-rprq7" Mar 17 12:23:15 crc kubenswrapper[4742]: I0317 12:23:15.584533 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477d657d-27a4-4597-990c-4f4e297424a0-catalog-content\") pod \"certified-operators-rprq7\" (UID: \"477d657d-27a4-4597-990c-4f4e297424a0\") " pod="openshift-marketplace/certified-operators-rprq7" Mar 17 12:23:15 crc kubenswrapper[4742]: I0317 12:23:15.608919 4742 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zkfkr\" (UniqueName: \"kubernetes.io/projected/477d657d-27a4-4597-990c-4f4e297424a0-kube-api-access-zkfkr\") pod \"certified-operators-rprq7\" (UID: \"477d657d-27a4-4597-990c-4f4e297424a0\") " pod="openshift-marketplace/certified-operators-rprq7" Mar 17 12:23:15 crc kubenswrapper[4742]: I0317 12:23:15.670229 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rprq7" Mar 17 12:23:16 crc kubenswrapper[4742]: I0317 12:23:16.253932 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rprq7"] Mar 17 12:23:16 crc kubenswrapper[4742]: I0317 12:23:16.729635 4742 generic.go:334] "Generic (PLEG): container finished" podID="477d657d-27a4-4597-990c-4f4e297424a0" containerID="625583d302dded4e652e6185e93c39dd457848d0606c4aa770a6d8d8991973bb" exitCode=0 Mar 17 12:23:16 crc kubenswrapper[4742]: I0317 12:23:16.729714 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rprq7" event={"ID":"477d657d-27a4-4597-990c-4f4e297424a0","Type":"ContainerDied","Data":"625583d302dded4e652e6185e93c39dd457848d0606c4aa770a6d8d8991973bb"} Mar 17 12:23:16 crc kubenswrapper[4742]: I0317 12:23:16.729971 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rprq7" event={"ID":"477d657d-27a4-4597-990c-4f4e297424a0","Type":"ContainerStarted","Data":"1d451a15983553d253c74027879e2e8cc6d73e696523366f37eecfcd0c19f319"} Mar 17 12:23:18 crc kubenswrapper[4742]: I0317 12:23:18.044113 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 17 12:23:18 crc kubenswrapper[4742]: I0317 12:23:18.044486 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 17 12:23:18 crc kubenswrapper[4742]: I0317 12:23:18.762332 4742 generic.go:334] "Generic (PLEG): container finished" podID="477d657d-27a4-4597-990c-4f4e297424a0" containerID="1cd6482745c444f613c9f77c565afa58e684392e277296eb59f81806167c3672" exitCode=0 Mar 17 12:23:18 crc kubenswrapper[4742]: I0317 12:23:18.762654 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rprq7" event={"ID":"477d657d-27a4-4597-990c-4f4e297424a0","Type":"ContainerDied","Data":"1cd6482745c444f613c9f77c565afa58e684392e277296eb59f81806167c3672"} Mar 17 12:23:19 crc kubenswrapper[4742]: I0317 12:23:19.782048 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rprq7" event={"ID":"477d657d-27a4-4597-990c-4f4e297424a0","Type":"ContainerStarted","Data":"9fdbf3b61a7b7ba885f5f78d3fb42e70decacb393ceaba15a6431e3ee47d1a9e"} Mar 17 12:23:19 crc kubenswrapper[4742]: I0317 12:23:19.805599 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rprq7" podStartSLOduration=2.326112306 podStartE2EDuration="4.805575554s" podCreationTimestamp="2026-03-17 12:23:15 +0000 UTC" 
firstStartedPulling="2026-03-17 12:23:16.731693229 +0000 UTC m=+4299.857820987" lastFinishedPulling="2026-03-17 12:23:19.211156487 +0000 UTC m=+4302.337284235" observedRunningTime="2026-03-17 12:23:19.798687234 +0000 UTC m=+4302.924815022" watchObservedRunningTime="2026-03-17 12:23:19.805575554 +0000 UTC m=+4302.931703342" Mar 17 12:23:25 crc kubenswrapper[4742]: I0317 12:23:25.670766 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rprq7" Mar 17 12:23:25 crc kubenswrapper[4742]: I0317 12:23:25.671432 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rprq7" Mar 17 12:23:25 crc kubenswrapper[4742]: I0317 12:23:25.728782 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rprq7" Mar 17 12:23:25 crc kubenswrapper[4742]: I0317 12:23:25.924462 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rprq7" Mar 17 12:23:25 crc kubenswrapper[4742]: I0317 12:23:25.990013 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rprq7"] Mar 17 12:23:27 crc kubenswrapper[4742]: I0317 12:23:27.871606 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rprq7" podUID="477d657d-27a4-4597-990c-4f4e297424a0" containerName="registry-server" containerID="cri-o://9fdbf3b61a7b7ba885f5f78d3fb42e70decacb393ceaba15a6431e3ee47d1a9e" gracePeriod=2 Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.404018 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rprq7" Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.547367 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477d657d-27a4-4597-990c-4f4e297424a0-catalog-content\") pod \"477d657d-27a4-4597-990c-4f4e297424a0\" (UID: \"477d657d-27a4-4597-990c-4f4e297424a0\") " Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.547519 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477d657d-27a4-4597-990c-4f4e297424a0-utilities\") pod \"477d657d-27a4-4597-990c-4f4e297424a0\" (UID: \"477d657d-27a4-4597-990c-4f4e297424a0\") " Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.547558 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkfkr\" (UniqueName: \"kubernetes.io/projected/477d657d-27a4-4597-990c-4f4e297424a0-kube-api-access-zkfkr\") pod \"477d657d-27a4-4597-990c-4f4e297424a0\" (UID: \"477d657d-27a4-4597-990c-4f4e297424a0\") " Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.549325 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/477d657d-27a4-4597-990c-4f4e297424a0-utilities" (OuterVolumeSpecName: "utilities") pod "477d657d-27a4-4597-990c-4f4e297424a0" (UID: "477d657d-27a4-4597-990c-4f4e297424a0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.555135 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/477d657d-27a4-4597-990c-4f4e297424a0-kube-api-access-zkfkr" (OuterVolumeSpecName: "kube-api-access-zkfkr") pod "477d657d-27a4-4597-990c-4f4e297424a0" (UID: "477d657d-27a4-4597-990c-4f4e297424a0"). InnerVolumeSpecName "kube-api-access-zkfkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.603680 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/477d657d-27a4-4597-990c-4f4e297424a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "477d657d-27a4-4597-990c-4f4e297424a0" (UID: "477d657d-27a4-4597-990c-4f4e297424a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.649691 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkfkr\" (UniqueName: \"kubernetes.io/projected/477d657d-27a4-4597-990c-4f4e297424a0-kube-api-access-zkfkr\") on node \"crc\" DevicePath \"\"" Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.649718 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/477d657d-27a4-4597-990c-4f4e297424a0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.649728 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/477d657d-27a4-4597-990c-4f4e297424a0-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.881128 4742 generic.go:334] "Generic (PLEG): container finished" podID="477d657d-27a4-4597-990c-4f4e297424a0" containerID="9fdbf3b61a7b7ba885f5f78d3fb42e70decacb393ceaba15a6431e3ee47d1a9e" exitCode=0 Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.881389 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rprq7" event={"ID":"477d657d-27a4-4597-990c-4f4e297424a0","Type":"ContainerDied","Data":"9fdbf3b61a7b7ba885f5f78d3fb42e70decacb393ceaba15a6431e3ee47d1a9e"} Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.881422 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rprq7" event={"ID":"477d657d-27a4-4597-990c-4f4e297424a0","Type":"ContainerDied","Data":"1d451a15983553d253c74027879e2e8cc6d73e696523366f37eecfcd0c19f319"} Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.881439 4742 scope.go:117] "RemoveContainer" containerID="9fdbf3b61a7b7ba885f5f78d3fb42e70decacb393ceaba15a6431e3ee47d1a9e" Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.881552 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rprq7" Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.900427 4742 scope.go:117] "RemoveContainer" containerID="1cd6482745c444f613c9f77c565afa58e684392e277296eb59f81806167c3672" Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.911168 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rprq7"] Mar 17 12:23:28 crc kubenswrapper[4742]: I0317 12:23:28.921971 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rprq7"] Mar 17 12:23:29 crc kubenswrapper[4742]: I0317 12:23:29.493848 4742 scope.go:117] "RemoveContainer" containerID="625583d302dded4e652e6185e93c39dd457848d0606c4aa770a6d8d8991973bb" Mar 17 12:23:29 crc kubenswrapper[4742]: I0317 12:23:29.619806 4742 scope.go:117] "RemoveContainer" containerID="9fdbf3b61a7b7ba885f5f78d3fb42e70decacb393ceaba15a6431e3ee47d1a9e" Mar 17 12:23:29 crc kubenswrapper[4742]: E0317 12:23:29.620409 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fdbf3b61a7b7ba885f5f78d3fb42e70decacb393ceaba15a6431e3ee47d1a9e\": container with ID starting with 9fdbf3b61a7b7ba885f5f78d3fb42e70decacb393ceaba15a6431e3ee47d1a9e not found: ID does not exist" containerID="9fdbf3b61a7b7ba885f5f78d3fb42e70decacb393ceaba15a6431e3ee47d1a9e" Mar 17 12:23:29 crc kubenswrapper[4742]: I0317 12:23:29.620469 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fdbf3b61a7b7ba885f5f78d3fb42e70decacb393ceaba15a6431e3ee47d1a9e"} err="failed to get container status \"9fdbf3b61a7b7ba885f5f78d3fb42e70decacb393ceaba15a6431e3ee47d1a9e\": rpc error: code = NotFound desc = could not find container \"9fdbf3b61a7b7ba885f5f78d3fb42e70decacb393ceaba15a6431e3ee47d1a9e\": container with ID starting with 9fdbf3b61a7b7ba885f5f78d3fb42e70decacb393ceaba15a6431e3ee47d1a9e not found: ID does not exist" Mar 17 12:23:29 crc kubenswrapper[4742]: I0317 12:23:29.620504 4742 scope.go:117] "RemoveContainer" containerID="1cd6482745c444f613c9f77c565afa58e684392e277296eb59f81806167c3672" Mar 17 12:23:29 crc kubenswrapper[4742]: E0317 12:23:29.621148 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cd6482745c444f613c9f77c565afa58e684392e277296eb59f81806167c3672\": container with ID starting with 1cd6482745c444f613c9f77c565afa58e684392e277296eb59f81806167c3672 not found: ID does not exist" containerID="1cd6482745c444f613c9f77c565afa58e684392e277296eb59f81806167c3672" Mar 17 12:23:29 crc kubenswrapper[4742]: I0317 12:23:29.621191 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cd6482745c444f613c9f77c565afa58e684392e277296eb59f81806167c3672"} err="failed to get container status \"1cd6482745c444f613c9f77c565afa58e684392e277296eb59f81806167c3672\": rpc error: code = NotFound desc = could not find container \"1cd6482745c444f613c9f77c565afa58e684392e277296eb59f81806167c3672\": container with ID starting with 1cd6482745c444f613c9f77c565afa58e684392e277296eb59f81806167c3672 not found: ID does not exist" Mar 17 12:23:29 crc kubenswrapper[4742]: I0317 12:23:29.621222 4742 scope.go:117] "RemoveContainer" containerID="625583d302dded4e652e6185e93c39dd457848d0606c4aa770a6d8d8991973bb" Mar 17 12:23:29 crc kubenswrapper[4742]: E0317 12:23:29.621761 4742 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"625583d302dded4e652e6185e93c39dd457848d0606c4aa770a6d8d8991973bb\": container with ID starting with 625583d302dded4e652e6185e93c39dd457848d0606c4aa770a6d8d8991973bb not found: ID does not exist" containerID="625583d302dded4e652e6185e93c39dd457848d0606c4aa770a6d8d8991973bb" Mar 17 12:23:29 crc kubenswrapper[4742]: I0317 12:23:29.621826 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625583d302dded4e652e6185e93c39dd457848d0606c4aa770a6d8d8991973bb"} err="failed to get container status \"625583d302dded4e652e6185e93c39dd457848d0606c4aa770a6d8d8991973bb\": rpc error: code = NotFound desc = could not find container \"625583d302dded4e652e6185e93c39dd457848d0606c4aa770a6d8d8991973bb\": container with ID starting with 625583d302dded4e652e6185e93c39dd457848d0606c4aa770a6d8d8991973bb not found: ID does not exist" Mar 17 12:23:30 crc kubenswrapper[4742]: I0317 12:23:30.674182 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="477d657d-27a4-4597-990c-4f4e297424a0" path="/var/lib/kubelet/pods/477d657d-27a4-4597-990c-4f4e297424a0/volumes" Mar 17 12:23:43 crc kubenswrapper[4742]: I0317 12:23:43.322842 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-497xk_80e4c493-69b8-4854-b25a-5126fd02720e/kube-rbac-proxy/0.log" Mar 17 12:23:43 crc kubenswrapper[4742]: I0317 12:23:43.442029 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-497xk_80e4c493-69b8-4854-b25a-5126fd02720e/controller/0.log" Mar 17 12:23:43 crc kubenswrapper[4742]: I0317 12:23:43.521779 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-frr-files/0.log" Mar 17 12:23:43 crc kubenswrapper[4742]: I0317 12:23:43.748661 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-reloader/0.log" Mar 17 12:23:43 crc kubenswrapper[4742]: I0317 12:23:43.756132 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-metrics/0.log" Mar 17 12:23:43 crc kubenswrapper[4742]: I0317 12:23:43.796596 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-frr-files/0.log" Mar 17 12:23:43 crc kubenswrapper[4742]: I0317 12:23:43.845036 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-reloader/0.log" Mar 17 12:23:43 crc kubenswrapper[4742]: I0317 12:23:43.994547 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-metrics/0.log" Mar 17 12:23:43 crc kubenswrapper[4742]: I0317 12:23:43.994553 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-reloader/0.log" Mar 17 12:23:44 crc kubenswrapper[4742]: I0317 12:23:44.001831 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-frr-files/0.log" Mar 17 12:23:44 crc kubenswrapper[4742]: I0317 12:23:44.032157 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-metrics/0.log" 
Mar 17 12:23:44 crc kubenswrapper[4742]: I0317 12:23:44.243132 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-reloader/0.log"
Mar 17 12:23:44 crc kubenswrapper[4742]: I0317 12:23:44.247536 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-frr-files/0.log"
Mar 17 12:23:44 crc kubenswrapper[4742]: I0317 12:23:44.249657 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/cp-metrics/0.log"
Mar 17 12:23:44 crc kubenswrapper[4742]: I0317 12:23:44.294015 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/controller/0.log"
Mar 17 12:23:44 crc kubenswrapper[4742]: I0317 12:23:44.447326 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/kube-rbac-proxy/0.log"
Mar 17 12:23:44 crc kubenswrapper[4742]: I0317 12:23:44.465782 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/frr-metrics/0.log"
Mar 17 12:23:44 crc kubenswrapper[4742]: I0317 12:23:44.547127 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/kube-rbac-proxy-frr/0.log"
Mar 17 12:23:44 crc kubenswrapper[4742]: I0317 12:23:44.733783 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/reloader/0.log"
Mar 17 12:23:44 crc kubenswrapper[4742]: I0317 12:23:44.783802 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-pfql6_e890c085-704d-45c9-9166-3d27780a18f6/frr-k8s-webhook-server/0.log"
Mar 17 12:23:44 crc kubenswrapper[4742]: I0317 12:23:44.949479 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6cbc4688f7-5wdxf_f21bd592-6b38-41b3-a6a1-9b782891a659/manager/0.log"
Mar 17 12:23:45 crc kubenswrapper[4742]: I0317 12:23:45.198849 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5df756f8d6-hq5d7_3e260a39-fc3d-48d3-90f5-151700332db7/webhook-server/0.log"
Mar 17 12:23:45 crc kubenswrapper[4742]: I0317 12:23:45.205595 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-67kh2_f0349b48-f18d-415d-bb8c-2ee11d489f9e/kube-rbac-proxy/0.log"
Mar 17 12:23:45 crc kubenswrapper[4742]: I0317 12:23:45.780399 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-67kh2_f0349b48-f18d-415d-bb8c-2ee11d489f9e/speaker/0.log"
Mar 17 12:23:46 crc kubenswrapper[4742]: I0317 12:23:46.093295 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-8rkdr_11909043-e311-4bf8-9ecf-8b3d33d2584a/frr/0.log"
Mar 17 12:23:48 crc kubenswrapper[4742]: I0317 12:23:48.043817 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 12:23:48 crc kubenswrapper[4742]: I0317 12:23:48.044249 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 12:24:00 crc kubenswrapper[4742]: I0317 12:24:00.160433 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562504-np424"]
Mar 17 12:24:00 crc kubenswrapper[4742]: E0317 12:24:00.161846 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="477d657d-27a4-4597-990c-4f4e297424a0" containerName="registry-server"
Mar 17 12:24:00 crc kubenswrapper[4742]: I0317 12:24:00.161872 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="477d657d-27a4-4597-990c-4f4e297424a0" containerName="registry-server"
Mar 17 12:24:00 crc kubenswrapper[4742]: E0317 12:24:00.161984 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="477d657d-27a4-4597-990c-4f4e297424a0" containerName="extract-utilities"
Mar 17 12:24:00 crc kubenswrapper[4742]: I0317 12:24:00.162002 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="477d657d-27a4-4597-990c-4f4e297424a0" containerName="extract-utilities"
Mar 17 12:24:00 crc kubenswrapper[4742]: E0317 12:24:00.162053 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="477d657d-27a4-4597-990c-4f4e297424a0" containerName="extract-content"
Mar 17 12:24:00 crc kubenswrapper[4742]: I0317 12:24:00.162068 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="477d657d-27a4-4597-990c-4f4e297424a0" containerName="extract-content"
Mar 17 12:24:00 crc kubenswrapper[4742]: I0317 12:24:00.162411 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="477d657d-27a4-4597-990c-4f4e297424a0" containerName="registry-server"
Mar 17 12:24:00 crc kubenswrapper[4742]: I0317 12:24:00.163556 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562504-np424"
Mar 17 12:24:00 crc kubenswrapper[4742]: I0317 12:24:00.166694 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 12:24:00 crc kubenswrapper[4742]: I0317 12:24:00.167088 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk"
Mar 17 12:24:00 crc kubenswrapper[4742]: I0317 12:24:00.167259 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 12:24:00 crc kubenswrapper[4742]: I0317 12:24:00.175602 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562504-np424"]
Mar 17 12:24:00 crc kubenswrapper[4742]: I0317 12:24:00.291171 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k25jg\" (UniqueName: \"kubernetes.io/projected/9455626a-2ab8-4553-8b11-d4de35afd5e8-kube-api-access-k25jg\") pod \"auto-csr-approver-29562504-np424\" (UID: \"9455626a-2ab8-4553-8b11-d4de35afd5e8\") " pod="openshift-infra/auto-csr-approver-29562504-np424"
Mar 17 12:24:00 crc kubenswrapper[4742]: I0317 12:24:00.394371 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k25jg\" (UniqueName: \"kubernetes.io/projected/9455626a-2ab8-4553-8b11-d4de35afd5e8-kube-api-access-k25jg\") pod \"auto-csr-approver-29562504-np424\" (UID: \"9455626a-2ab8-4553-8b11-d4de35afd5e8\") " pod="openshift-infra/auto-csr-approver-29562504-np424"
Mar 17 12:24:00 crc kubenswrapper[4742]: I0317 12:24:00.425087 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k25jg\" (UniqueName: \"kubernetes.io/projected/9455626a-2ab8-4553-8b11-d4de35afd5e8-kube-api-access-k25jg\") pod \"auto-csr-approver-29562504-np424\" (UID: \"9455626a-2ab8-4553-8b11-d4de35afd5e8\") " pod="openshift-infra/auto-csr-approver-29562504-np424"
Mar 17 12:24:00 crc kubenswrapper[4742]: I0317 12:24:00.503082 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562504-np424"
Mar 17 12:24:00 crc kubenswrapper[4742]: I0317 12:24:00.968234 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562504-np424"]
Mar 17 12:24:00 crc kubenswrapper[4742]: W0317 12:24:00.975718 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9455626a_2ab8_4553_8b11_d4de35afd5e8.slice/crio-0cc8d991428120f25435114ea10f488a6605676c8798ac5e8a978e28c5bf2404 WatchSource:0}: Error finding container 0cc8d991428120f25435114ea10f488a6605676c8798ac5e8a978e28c5bf2404: Status 404 returned error can't find the container with id 0cc8d991428120f25435114ea10f488a6605676c8798ac5e8a978e28c5bf2404
Mar 17 12:24:01 crc kubenswrapper[4742]: I0317 12:24:01.022059 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr_20e57e18-cc27-4d2e-9207-e784beb4ce2f/util/0.log"
Mar 17 12:24:01 crc kubenswrapper[4742]: I0317 12:24:01.155547 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562504-np424" event={"ID":"9455626a-2ab8-4553-8b11-d4de35afd5e8","Type":"ContainerStarted","Data":"0cc8d991428120f25435114ea10f488a6605676c8798ac5e8a978e28c5bf2404"}
Mar 17 12:24:01 crc kubenswrapper[4742]: I0317 12:24:01.174452 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr_20e57e18-cc27-4d2e-9207-e784beb4ce2f/util/0.log"
Mar 17 12:24:01 crc kubenswrapper[4742]: I0317 12:24:01.181263 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr_20e57e18-cc27-4d2e-9207-e784beb4ce2f/pull/0.log"
Mar 17 12:24:01 crc kubenswrapper[4742]: I0317 12:24:01.227469 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr_20e57e18-cc27-4d2e-9207-e784beb4ce2f/pull/0.log"
Mar 17 12:24:01 crc kubenswrapper[4742]: I0317 12:24:01.392415 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr_20e57e18-cc27-4d2e-9207-e784beb4ce2f/pull/0.log"
Mar 17 12:24:01 crc kubenswrapper[4742]: I0317 12:24:01.432721 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr_20e57e18-cc27-4d2e-9207-e784beb4ce2f/util/0.log"
Mar 17 12:24:01 crc kubenswrapper[4742]: I0317 12:24:01.438392 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874s7dmr_20e57e18-cc27-4d2e-9207-e784beb4ce2f/extract/0.log"
Mar 17 12:24:01 crc kubenswrapper[4742]: I0317 12:24:01.587559 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp_8011261b-573f-4e09-894b-0643fba90f8d/util/0.log"
Mar 17 12:24:01 crc kubenswrapper[4742]: I0317 12:24:01.744580 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp_8011261b-573f-4e09-894b-0643fba90f8d/pull/0.log"
Mar 17 12:24:01 crc kubenswrapper[4742]: I0317 12:24:01.750143 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp_8011261b-573f-4e09-894b-0643fba90f8d/pull/0.log"
Mar 17 12:24:01 crc kubenswrapper[4742]: I0317 12:24:01.793689 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp_8011261b-573f-4e09-894b-0643fba90f8d/util/0.log"
Mar 17 12:24:02 crc kubenswrapper[4742]: I0317 12:24:02.539870 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp_8011261b-573f-4e09-894b-0643fba90f8d/util/0.log"
Mar 17 12:24:02 crc kubenswrapper[4742]: I0317 12:24:02.591338 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp_8011261b-573f-4e09-894b-0643fba90f8d/pull/0.log"
Mar 17 12:24:02 crc kubenswrapper[4742]: I0317 12:24:02.609570 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1rqsbp_8011261b-573f-4e09-894b-0643fba90f8d/extract/0.log"
Mar 17 12:24:02 crc kubenswrapper[4742]: I0317 12:24:02.733836 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nq4d_ebd9754c-6bff-490f-a8c5-5aa16bb9170e/extract-utilities/0.log"
Mar 17 12:24:02 crc kubenswrapper[4742]: I0317 12:24:02.923141 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nq4d_ebd9754c-6bff-490f-a8c5-5aa16bb9170e/extract-content/0.log"
Mar 17 12:24:02 crc kubenswrapper[4742]: I0317 12:24:02.926550 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nq4d_ebd9754c-6bff-490f-a8c5-5aa16bb9170e/extract-content/0.log"
Mar 17 12:24:02 crc kubenswrapper[4742]: I0317 12:24:02.940750 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nq4d_ebd9754c-6bff-490f-a8c5-5aa16bb9170e/extract-utilities/0.log"
Mar 17 12:24:03 crc kubenswrapper[4742]: I0317 12:24:03.055483 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nq4d_ebd9754c-6bff-490f-a8c5-5aa16bb9170e/extract-utilities/0.log"
Mar 17 12:24:03 crc kubenswrapper[4742]: I0317 12:24:03.159043 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nq4d_ebd9754c-6bff-490f-a8c5-5aa16bb9170e/extract-content/0.log"
Mar 17 12:24:03 crc kubenswrapper[4742]: I0317 12:24:03.185806 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562504-np424" event={"ID":"9455626a-2ab8-4553-8b11-d4de35afd5e8","Type":"ContainerStarted","Data":"8c07cefe24996b9cba2bb463f42b35eb84e76dc431848197dfe892e448a50713"}
Mar 17 12:24:03 crc kubenswrapper[4742]: I0317 12:24:03.207127 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29562504-np424" podStartSLOduration=1.535823113 podStartE2EDuration="3.207110776s" podCreationTimestamp="2026-03-17 12:24:00 +0000 UTC" firstStartedPulling="2026-03-17 12:24:00.977993493 +0000 UTC m=+4344.104121241" lastFinishedPulling="2026-03-17 12:24:02.649281146 +0000 UTC m=+4345.775408904" observedRunningTime="2026-03-17 12:24:03.200782553 +0000 UTC m=+4346.326910321" watchObservedRunningTime="2026-03-17 12:24:03.207110776 +0000 UTC m=+4346.333238534"
Mar 17 12:24:03 crc kubenswrapper[4742]: I0317 12:24:03.365025 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f2tmr_3aae4d83-a6a4-440f-b772-a5cb34a9f1fa/extract-utilities/0.log"
Mar 17 12:24:03 crc kubenswrapper[4742]: I0317 12:24:03.433214 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f2tmr_3aae4d83-a6a4-440f-b772-a5cb34a9f1fa/extract-utilities/0.log"
Mar 17 12:24:03 crc kubenswrapper[4742]: I0317 12:24:03.542671 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f2tmr_3aae4d83-a6a4-440f-b772-a5cb34a9f1fa/extract-content/0.log"
Mar 17 12:24:03 crc kubenswrapper[4742]: I0317 12:24:03.580413 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f2tmr_3aae4d83-a6a4-440f-b772-a5cb34a9f1fa/extract-content/0.log"
Mar 17 12:24:03 crc kubenswrapper[4742]: I0317 12:24:03.755822 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f2tmr_3aae4d83-a6a4-440f-b772-a5cb34a9f1fa/extract-utilities/0.log"
Mar 17 12:24:03 crc kubenswrapper[4742]: I0317 12:24:03.804571 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5nq4d_ebd9754c-6bff-490f-a8c5-5aa16bb9170e/registry-server/0.log"
Mar 17 12:24:03 crc kubenswrapper[4742]: I0317 12:24:03.840462 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f2tmr_3aae4d83-a6a4-440f-b772-a5cb34a9f1fa/extract-content/0.log"
Mar 17 12:24:04 crc kubenswrapper[4742]: I0317 12:24:04.036420 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-rxctp_66e4c4dd-b0fe-4877-8520-bdbd18b096d4/marketplace-operator/0.log"
Mar 17 12:24:04 crc kubenswrapper[4742]: I0317 12:24:04.198695 4742 generic.go:334] "Generic (PLEG): container finished" podID="9455626a-2ab8-4553-8b11-d4de35afd5e8" containerID="8c07cefe24996b9cba2bb463f42b35eb84e76dc431848197dfe892e448a50713" exitCode=0
Mar 17 12:24:04 crc kubenswrapper[4742]: I0317 12:24:04.198736 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562504-np424" event={"ID":"9455626a-2ab8-4553-8b11-d4de35afd5e8","Type":"ContainerDied","Data":"8c07cefe24996b9cba2bb463f42b35eb84e76dc431848197dfe892e448a50713"}
Mar 17 12:24:04 crc kubenswrapper[4742]: I0317 12:24:04.232598 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhqsm_e827c1af-bb51-4f3d-bf81-708986989404/extract-utilities/0.log"
Mar 17 12:24:04 crc kubenswrapper[4742]: I0317 12:24:04.235114 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-f2tmr_3aae4d83-a6a4-440f-b772-a5cb34a9f1fa/registry-server/0.log"
Mar 17 12:24:04 crc kubenswrapper[4742]: I0317 12:24:04.385667 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhqsm_e827c1af-bb51-4f3d-bf81-708986989404/extract-content/0.log"
Mar 17 12:24:04 crc kubenswrapper[4742]: I0317 12:24:04.406188 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhqsm_e827c1af-bb51-4f3d-bf81-708986989404/extract-content/0.log"
Mar 17 12:24:04 crc kubenswrapper[4742]: I0317 12:24:04.450986 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhqsm_e827c1af-bb51-4f3d-bf81-708986989404/extract-utilities/0.log"
Mar 17 12:24:04 crc kubenswrapper[4742]: I0317 12:24:04.585414 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhqsm_e827c1af-bb51-4f3d-bf81-708986989404/extract-content/0.log"
Mar 17 12:24:04 crc kubenswrapper[4742]: I0317 12:24:04.588262 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhqsm_e827c1af-bb51-4f3d-bf81-708986989404/extract-utilities/0.log"
Mar 17 12:24:04 crc kubenswrapper[4742]: I0317 12:24:04.680201 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xqrwr_b2345c7c-e927-4754-af27-9c836794d9c8/extract-utilities/0.log"
Mar 17 12:24:04 crc kubenswrapper[4742]: I0317 12:24:04.768553 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rhqsm_e827c1af-bb51-4f3d-bf81-708986989404/registry-server/0.log"
Mar 17 12:24:04 crc kubenswrapper[4742]: I0317 12:24:04.850998 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xqrwr_b2345c7c-e927-4754-af27-9c836794d9c8/extract-content/0.log"
Mar 17 12:24:04 crc kubenswrapper[4742]: I0317 12:24:04.886367 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xqrwr_b2345c7c-e927-4754-af27-9c836794d9c8/extract-utilities/0.log"
Mar 17 12:24:04 crc kubenswrapper[4742]: I0317 12:24:04.904628 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xqrwr_b2345c7c-e927-4754-af27-9c836794d9c8/extract-content/0.log"
Mar 17 12:24:04 crc kubenswrapper[4742]: I0317 12:24:04.981560 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xqrwr_b2345c7c-e927-4754-af27-9c836794d9c8/extract-utilities/0.log"
Mar 17 12:24:05 crc kubenswrapper[4742]: I0317 12:24:05.005057 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xqrwr_b2345c7c-e927-4754-af27-9c836794d9c8/extract-content/0.log"
Mar 17 12:24:05 crc kubenswrapper[4742]: I0317 12:24:05.166154 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xqrwr_b2345c7c-e927-4754-af27-9c836794d9c8/registry-server/0.log"
Mar 17 12:24:05 crc kubenswrapper[4742]: I0317 12:24:05.550247 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562504-np424"
Mar 17 12:24:05 crc kubenswrapper[4742]: I0317 12:24:05.690011 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k25jg\" (UniqueName: \"kubernetes.io/projected/9455626a-2ab8-4553-8b11-d4de35afd5e8-kube-api-access-k25jg\") pod \"9455626a-2ab8-4553-8b11-d4de35afd5e8\" (UID: \"9455626a-2ab8-4553-8b11-d4de35afd5e8\") "
Mar 17 12:24:05 crc kubenswrapper[4742]: I0317 12:24:05.698864 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9455626a-2ab8-4553-8b11-d4de35afd5e8-kube-api-access-k25jg" (OuterVolumeSpecName: "kube-api-access-k25jg") pod "9455626a-2ab8-4553-8b11-d4de35afd5e8" (UID: "9455626a-2ab8-4553-8b11-d4de35afd5e8"). InnerVolumeSpecName "kube-api-access-k25jg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 12:24:05 crc kubenswrapper[4742]: I0317 12:24:05.793665 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k25jg\" (UniqueName: \"kubernetes.io/projected/9455626a-2ab8-4553-8b11-d4de35afd5e8-kube-api-access-k25jg\") on node \"crc\" DevicePath \"\""
Mar 17 12:24:06 crc kubenswrapper[4742]: I0317 12:24:06.217953 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562504-np424" event={"ID":"9455626a-2ab8-4553-8b11-d4de35afd5e8","Type":"ContainerDied","Data":"0cc8d991428120f25435114ea10f488a6605676c8798ac5e8a978e28c5bf2404"}
Mar 17 12:24:06 crc kubenswrapper[4742]: I0317 12:24:06.217990 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cc8d991428120f25435114ea10f488a6605676c8798ac5e8a978e28c5bf2404"
Mar 17 12:24:06 crc kubenswrapper[4742]: I0317 12:24:06.218099 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562504-np424"
Mar 17 12:24:06 crc kubenswrapper[4742]: I0317 12:24:06.283859 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562498-fnkmh"]
Mar 17 12:24:06 crc kubenswrapper[4742]: I0317 12:24:06.295702 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562498-fnkmh"]
Mar 17 12:24:06 crc kubenswrapper[4742]: I0317 12:24:06.678621 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a80dc9c-7f4a-48a4-9496-ef8edc764a15" path="/var/lib/kubelet/pods/6a80dc9c-7f4a-48a4-9496-ef8edc764a15/volumes"
Mar 17 12:24:18 crc kubenswrapper[4742]: I0317 12:24:18.044370 4742 patch_prober.go:28] interesting pod/machine-config-daemon-5jxxw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 17 12:24:18 crc kubenswrapper[4742]: I0317 12:24:18.044978 4742 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 17 12:24:18 crc kubenswrapper[4742]: I0317 12:24:18.045044 4742 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw"
Mar 17 12:24:18 crc kubenswrapper[4742]: I0317 12:24:18.046065 4742 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0"} pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 17 12:24:18 crc kubenswrapper[4742]: I0317 12:24:18.046148 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerName="machine-config-daemon" containerID="cri-o://ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0" gracePeriod=600
Mar 17 12:24:18 crc kubenswrapper[4742]: E0317 12:24:18.172792 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882"
Mar 17 12:24:18 crc kubenswrapper[4742]: I0317 12:24:18.345184 4742 generic.go:334] "Generic (PLEG): container finished" podID="5e11ad39-38bb-4b70-9cac-ce078b37f882" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0" exitCode=0
Mar 17 12:24:18 crc kubenswrapper[4742]: I0317 12:24:18.345232 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerDied","Data":"ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0"}
Mar 17 12:24:18 crc kubenswrapper[4742]: I0317 12:24:18.345318 4742 scope.go:117] "RemoveContainer" containerID="b4c9cacb2ad65768276b5012b7d6a56bc72a471d174cfcf7c4bf8a60597c5822"
Mar 17 12:24:18 crc kubenswrapper[4742]: I0317 12:24:18.346127 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0"
Mar 17 12:24:18 crc kubenswrapper[4742]: E0317 12:24:18.346731 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882"
Mar 17 12:24:33 crc kubenswrapper[4742]: I0317 12:24:33.663018 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0"
Mar 17 12:24:33 crc kubenswrapper[4742]: E0317 12:24:33.663727 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882"
Mar 17 12:24:45 crc kubenswrapper[4742]: I0317 12:24:45.664105 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0"
Mar 17 12:24:45 crc kubenswrapper[4742]: E0317 12:24:45.664818 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882"
Mar 17 12:24:58 crc kubenswrapper[4742]: I0317 12:24:58.386871 4742 scope.go:117] "RemoveContainer" containerID="c8de0bdb1a4b867408bb92d47c1094c836a9752e1024f8b322cb9415d4eb6fb1"
Mar 17 12:25:00 crc kubenswrapper[4742]: I0317 12:25:00.664665 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0"
Mar 17 12:25:00 crc kubenswrapper[4742]: E0317 12:25:00.665192 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882"
Mar 17 12:25:11 crc kubenswrapper[4742]: I0317 12:25:11.663819 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0"
Mar 17 12:25:11 crc kubenswrapper[4742]: E0317 12:25:11.665097 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882"
Mar 17 12:25:22 crc kubenswrapper[4742]: I0317 12:25:22.664701 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0"
Mar 17 12:25:22 crc kubenswrapper[4742]: E0317 12:25:22.665940 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882"
Mar 17 12:25:36 crc kubenswrapper[4742]: I0317 12:25:36.663346 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0"
Mar 17 12:25:36 crc kubenswrapper[4742]: E0317 12:25:36.664419 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882"
Mar 17 12:25:51 crc kubenswrapper[4742]: I0317 12:25:51.662437 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0"
Mar 17 12:25:51 crc kubenswrapper[4742]: E0317 12:25:51.664851 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882"
Mar 17 12:25:56 crc kubenswrapper[4742]: I0317 12:25:56.385660 4742 generic.go:334] "Generic (PLEG): container finished" podID="09d769ba-43cf-4abc-aec6-f21879cc4c38" containerID="2b007bf8242742771d437d12d0e69b64e5f28004bccd213e799b8a450c45389d" exitCode=0
Mar 17 12:25:56 crc kubenswrapper[4742]: I0317 12:25:56.385789 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rcph6/must-gather-nvl7j" event={"ID":"09d769ba-43cf-4abc-aec6-f21879cc4c38","Type":"ContainerDied","Data":"2b007bf8242742771d437d12d0e69b64e5f28004bccd213e799b8a450c45389d"}
Mar 17 12:25:56 crc kubenswrapper[4742]: I0317 12:25:56.387472 4742 scope.go:117] "RemoveContainer" containerID="2b007bf8242742771d437d12d0e69b64e5f28004bccd213e799b8a450c45389d"
Mar 17 12:25:56 crc kubenswrapper[4742]: I0317 12:25:56.679468 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rcph6_must-gather-nvl7j_09d769ba-43cf-4abc-aec6-f21879cc4c38/gather/0.log"
Mar 17 12:26:00 crc kubenswrapper[4742]: I0317 12:26:00.164038 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562506-sfmvb"]
Mar 17 12:26:00 crc kubenswrapper[4742]: E0317 12:26:00.165248 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9455626a-2ab8-4553-8b11-d4de35afd5e8" containerName="oc"
Mar 17 12:26:00 crc kubenswrapper[4742]: I0317 12:26:00.165268 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="9455626a-2ab8-4553-8b11-d4de35afd5e8" containerName="oc"
Mar 17 12:26:00 crc kubenswrapper[4742]: I0317 12:26:00.165600 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="9455626a-2ab8-4553-8b11-d4de35afd5e8" containerName="oc"
Mar 17 12:26:00 crc kubenswrapper[4742]: I0317 12:26:00.166509 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562506-sfmvb"
Mar 17 12:26:00 crc kubenswrapper[4742]: I0317 12:26:00.169582 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 17 12:26:00 crc kubenswrapper[4742]: I0317 12:26:00.169762 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 17 12:26:00 crc kubenswrapper[4742]: I0317 12:26:00.170749 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk"
Mar 17 12:26:00 crc kubenswrapper[4742]: I0317 12:26:00.192101 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562506-sfmvb"]
Mar 17 12:26:00 crc kubenswrapper[4742]: I0317 12:26:00.261406 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhcpq\" (UniqueName: \"kubernetes.io/projected/9504a2d2-8cdc-4516-82d3-0f48602857c6-kube-api-access-bhcpq\") pod \"auto-csr-approver-29562506-sfmvb\" (UID: \"9504a2d2-8cdc-4516-82d3-0f48602857c6\") " pod="openshift-infra/auto-csr-approver-29562506-sfmvb"
Mar 17 12:26:00 crc kubenswrapper[4742]: I0317 12:26:00.363866 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhcpq\" (UniqueName: \"kubernetes.io/projected/9504a2d2-8cdc-4516-82d3-0f48602857c6-kube-api-access-bhcpq\") pod \"auto-csr-approver-29562506-sfmvb\" (UID: \"9504a2d2-8cdc-4516-82d3-0f48602857c6\") " pod="openshift-infra/auto-csr-approver-29562506-sfmvb"
Mar 17 12:26:00 crc kubenswrapper[4742]: I0317 12:26:00.390675 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhcpq\" (UniqueName: \"kubernetes.io/projected/9504a2d2-8cdc-4516-82d3-0f48602857c6-kube-api-access-bhcpq\") pod \"auto-csr-approver-29562506-sfmvb\" (UID: \"9504a2d2-8cdc-4516-82d3-0f48602857c6\") " pod="openshift-infra/auto-csr-approver-29562506-sfmvb"
Mar 17 12:26:00 crc kubenswrapper[4742]: I0317 12:26:00.518719 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562506-sfmvb"
Mar 17 12:26:01 crc kubenswrapper[4742]: I0317 12:26:01.034300 4742 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 17 12:26:01 crc kubenswrapper[4742]: I0317 12:26:01.043767 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562506-sfmvb"]
Mar 17 12:26:01 crc kubenswrapper[4742]: I0317 12:26:01.059196 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z4rjw"]
Mar 17 12:26:01 crc kubenswrapper[4742]: I0317 12:26:01.061519 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z4rjw"
Mar 17 12:26:01 crc kubenswrapper[4742]: I0317 12:26:01.077410 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z4rjw"]
Mar 17 12:26:01 crc kubenswrapper[4742]: I0317 12:26:01.196690 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbsfq\" (UniqueName: \"kubernetes.io/projected/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-kube-api-access-sbsfq\") pod \"redhat-marketplace-z4rjw\" (UID: \"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600\") " pod="openshift-marketplace/redhat-marketplace-z4rjw"
Mar 17 12:26:01 crc kubenswrapper[4742]: I0317 12:26:01.196804 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-catalog-content\") pod \"redhat-marketplace-z4rjw\" (UID: \"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600\") " pod="openshift-marketplace/redhat-marketplace-z4rjw"
Mar 17 12:26:01 crc kubenswrapper[4742]: I0317 12:26:01.197015 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-utilities\") pod \"redhat-marketplace-z4rjw\" (UID: \"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600\") " pod="openshift-marketplace/redhat-marketplace-z4rjw"
Mar 17 12:26:01 crc kubenswrapper[4742]: I0317 12:26:01.299244 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-utilities\") pod \"redhat-marketplace-z4rjw\" (UID: \"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600\") " pod="openshift-marketplace/redhat-marketplace-z4rjw"
Mar 17 12:26:01 crc kubenswrapper[4742]: I0317 12:26:01.299378 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbsfq\" (UniqueName: \"kubernetes.io/projected/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-kube-api-access-sbsfq\") pod \"redhat-marketplace-z4rjw\" (UID: \"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600\") " pod="openshift-marketplace/redhat-marketplace-z4rjw"
Mar 17 12:26:01 crc kubenswrapper[4742]: I0317 12:26:01.299436 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-catalog-content\") pod \"redhat-marketplace-z4rjw\" (UID: \"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600\") " pod="openshift-marketplace/redhat-marketplace-z4rjw"
Mar 17 12:26:01 crc kubenswrapper[4742]: I0317 12:26:01.300032 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-catalog-content\") pod \"redhat-marketplace-z4rjw\" (UID: \"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600\") " pod="openshift-marketplace/redhat-marketplace-z4rjw"
Mar 17 12:26:01 crc kubenswrapper[4742]: I0317 12:26:01.300063 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-utilities\") pod \"redhat-marketplace-z4rjw\" (UID: \"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600\") " pod="openshift-marketplace/redhat-marketplace-z4rjw"
Mar 17 12:26:01 crc kubenswrapper[4742]: I0317 12:26:01.318575 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbsfq\" (UniqueName: \"kubernetes.io/projected/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-kube-api-access-sbsfq\") pod \"redhat-marketplace-z4rjw\" (UID: \"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600\") " pod="openshift-marketplace/redhat-marketplace-z4rjw"
Mar 17 12:26:01 crc kubenswrapper[4742]: I0317 12:26:01.412876 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z4rjw"
Mar 17 12:26:01 crc kubenswrapper[4742]: I0317 12:26:01.447985 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562506-sfmvb" event={"ID":"9504a2d2-8cdc-4516-82d3-0f48602857c6","Type":"ContainerStarted","Data":"f6cb236297d4dd10b1041f86d67d70baed904dc0e15143ad010cac66024d2e24"}
Mar 17 12:26:01 crc kubenswrapper[4742]: I0317 12:26:01.911762 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z4rjw"]
Mar 17 12:26:01 crc kubenswrapper[4742]: W0317 12:26:01.917008 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a3b6cbb_6fe3_4264_90fa_e9f12f36b600.slice/crio-4f096a74c3b8fcd6a9c125e0a2dd207519086c949b8c3d2304ddd40331533488 WatchSource:0}: Error finding container 4f096a74c3b8fcd6a9c125e0a2dd207519086c949b8c3d2304ddd40331533488: Status 404 returned error can't find the container with id 4f096a74c3b8fcd6a9c125e0a2dd207519086c949b8c3d2304ddd40331533488
Mar 17 12:26:02 crc kubenswrapper[4742]: I0317 12:26:02.460011 4742 generic.go:334] "Generic (PLEG): container finished" podID="6a3b6cbb-6fe3-4264-90fa-e9f12f36b600" containerID="ed069b4cba40f4015293bd931120de6b0c658093fc94180f036b70c5d5344049" exitCode=0
Mar 17 12:26:02 crc kubenswrapper[4742]: I0317 12:26:02.460092 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4rjw" event={"ID":"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600","Type":"ContainerDied","Data":"ed069b4cba40f4015293bd931120de6b0c658093fc94180f036b70c5d5344049"}
Mar 17 12:26:02 crc kubenswrapper[4742]: I0317 12:26:02.460356 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4rjw" event={"ID":"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600","Type":"ContainerStarted","Data":"4f096a74c3b8fcd6a9c125e0a2dd207519086c949b8c3d2304ddd40331533488"}
Mar 17 12:26:02 crc kubenswrapper[4742]: I0317 12:26:02.472627 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562506-sfmvb" event={"ID":"9504a2d2-8cdc-4516-82d3-0f48602857c6","Type":"ContainerStarted","Data":"42385aef9b228ece28f7cef37c71cc9345e5d8ff573e3717ae1f66de36e397d2"}
Mar 17 12:26:03 crc kubenswrapper[4742]: I0317 12:26:03.488141 4742 generic.go:334] "Generic (PLEG): container finished" podID="9504a2d2-8cdc-4516-82d3-0f48602857c6" containerID="42385aef9b228ece28f7cef37c71cc9345e5d8ff573e3717ae1f66de36e397d2" exitCode=0
Mar 17 12:26:03 crc kubenswrapper[4742]: I0317 12:26:03.488523 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562506-sfmvb" event={"ID":"9504a2d2-8cdc-4516-82d3-0f48602857c6","Type":"ContainerDied","Data":"42385aef9b228ece28f7cef37c71cc9345e5d8ff573e3717ae1f66de36e397d2"}
Mar 17 12:26:04 crc kubenswrapper[4742]: I0317 12:26:04.504709 4742 generic.go:334] "Generic (PLEG): container finished" podID="6a3b6cbb-6fe3-4264-90fa-e9f12f36b600" containerID="a002ab58d97d47b88ff131efc59eec4f5e8249d7863f7c087fe0c36a3af860ae" exitCode=0
Mar 17 12:26:04 crc kubenswrapper[4742]: I0317 12:26:04.504808 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4rjw" event={"ID":"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600","Type":"ContainerDied","Data":"a002ab58d97d47b88ff131efc59eec4f5e8249d7863f7c087fe0c36a3af860ae"}
Mar 17 12:26:04 crc kubenswrapper[4742]: I0317 12:26:04.883361 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562506-sfmvb"
Mar 17 12:26:04 crc kubenswrapper[4742]: I0317 12:26:04.984338 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhcpq\" (UniqueName: \"kubernetes.io/projected/9504a2d2-8cdc-4516-82d3-0f48602857c6-kube-api-access-bhcpq\") pod \"9504a2d2-8cdc-4516-82d3-0f48602857c6\" (UID: \"9504a2d2-8cdc-4516-82d3-0f48602857c6\") "
Mar 17 12:26:04 crc kubenswrapper[4742]: I0317 12:26:04.990160 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9504a2d2-8cdc-4516-82d3-0f48602857c6-kube-api-access-bhcpq" (OuterVolumeSpecName: "kube-api-access-bhcpq") pod "9504a2d2-8cdc-4516-82d3-0f48602857c6" (UID: "9504a2d2-8cdc-4516-82d3-0f48602857c6"). InnerVolumeSpecName "kube-api-access-bhcpq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 12:26:05 crc kubenswrapper[4742]: I0317 12:26:05.086823 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhcpq\" (UniqueName: \"kubernetes.io/projected/9504a2d2-8cdc-4516-82d3-0f48602857c6-kube-api-access-bhcpq\") on node \"crc\" DevicePath \"\""
Mar 17 12:26:05 crc kubenswrapper[4742]: I0317 12:26:05.521797 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4rjw" event={"ID":"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600","Type":"ContainerStarted","Data":"fd83ca315ba90e620816a1e4cdf4716f1b5fda617dde480bcaf50f681e4dd57f"}
Mar 17 12:26:05 crc kubenswrapper[4742]: I0317 12:26:05.525204 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562506-sfmvb" event={"ID":"9504a2d2-8cdc-4516-82d3-0f48602857c6","Type":"ContainerDied","Data":"f6cb236297d4dd10b1041f86d67d70baed904dc0e15143ad010cac66024d2e24"}
Mar 17 12:26:05 crc kubenswrapper[4742]: I0317 12:26:05.525539 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562506-sfmvb"
Mar 17 12:26:05 crc kubenswrapper[4742]: I0317 12:26:05.525557 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6cb236297d4dd10b1041f86d67d70baed904dc0e15143ad010cac66024d2e24"
Mar 17 12:26:05 crc kubenswrapper[4742]: I0317 12:26:05.558135 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z4rjw" podStartSLOduration=2.098959701 podStartE2EDuration="4.558105771s" podCreationTimestamp="2026-03-17 12:26:01 +0000 UTC" firstStartedPulling="2026-03-17 12:26:02.462116877 +0000 UTC m=+4465.588244645" lastFinishedPulling="2026-03-17 12:26:04.921262957 +0000 UTC m=+4468.047390715" observedRunningTime="2026-03-17 12:26:05.548779474 +0000 UTC m=+4468.674907272" watchObservedRunningTime="2026-03-17 12:26:05.558105771 +0000 UTC m=+4468.684233569"
Mar 17 12:26:05 crc kubenswrapper[4742]: I0317 12:26:05.601881 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562500-mpzhw"]
Mar 17 12:26:05 crc kubenswrapper[4742]: I0317 12:26:05.612529 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562500-mpzhw"]
Mar 17 12:26:05 crc kubenswrapper[4742]: I0317 12:26:05.663488 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0"
Mar 17 12:26:05 crc kubenswrapper[4742]: E0317 12:26:05.663779 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882"
Mar 17 12:26:06 crc kubenswrapper[4742]: I0317 12:26:06.673646 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0252c4c4-de55-42c1-97c0-34bc9d6ea579" path="/var/lib/kubelet/pods/0252c4c4-de55-42c1-97c0-34bc9d6ea579/volumes"
Mar 17 12:26:07 crc kubenswrapper[4742]: I0317 12:26:07.041672 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xgt2l"]
Mar 17 12:26:07 crc kubenswrapper[4742]: E0317 12:26:07.046039 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9504a2d2-8cdc-4516-82d3-0f48602857c6" containerName="oc"
Mar 17 12:26:07 crc kubenswrapper[4742]: I0317 12:26:07.046090 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="9504a2d2-8cdc-4516-82d3-0f48602857c6" containerName="oc"
Mar 17 12:26:07 crc kubenswrapper[4742]: I0317 12:26:07.046514 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="9504a2d2-8cdc-4516-82d3-0f48602857c6" containerName="oc"
Mar 17 12:26:07 crc kubenswrapper[4742]: I0317 12:26:07.048807 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgt2l"
Mar 17 12:26:07 crc kubenswrapper[4742]: I0317 12:26:07.054373 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xgt2l"]
Mar 17 12:26:07 crc kubenswrapper[4742]: I0317 12:26:07.124479 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm42t\" (UniqueName: \"kubernetes.io/projected/3eb6c2c5-9891-4e25-91d9-f4fc2e63d549-kube-api-access-fm42t\") pod \"community-operators-xgt2l\" (UID: \"3eb6c2c5-9891-4e25-91d9-f4fc2e63d549\") " pod="openshift-marketplace/community-operators-xgt2l"
Mar 17 12:26:07 crc kubenswrapper[4742]: I0317 12:26:07.124548 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eb6c2c5-9891-4e25-91d9-f4fc2e63d549-catalog-content\") pod \"community-operators-xgt2l\" (UID: \"3eb6c2c5-9891-4e25-91d9-f4fc2e63d549\") " pod="openshift-marketplace/community-operators-xgt2l"
Mar 17 12:26:07 crc kubenswrapper[4742]: I0317 12:26:07.124576 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eb6c2c5-9891-4e25-91d9-f4fc2e63d549-utilities\") pod \"community-operators-xgt2l\" (UID: \"3eb6c2c5-9891-4e25-91d9-f4fc2e63d549\") " pod="openshift-marketplace/community-operators-xgt2l"
Mar 17 12:26:07 crc kubenswrapper[4742]: I0317 12:26:07.225796 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eb6c2c5-9891-4e25-91d9-f4fc2e63d549-catalog-content\") pod \"community-operators-xgt2l\" (UID: \"3eb6c2c5-9891-4e25-91d9-f4fc2e63d549\") " pod="openshift-marketplace/community-operators-xgt2l"
Mar 17 12:26:07 crc kubenswrapper[4742]: I0317 12:26:07.225847 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eb6c2c5-9891-4e25-91d9-f4fc2e63d549-utilities\") pod \"community-operators-xgt2l\" (UID: \"3eb6c2c5-9891-4e25-91d9-f4fc2e63d549\") " pod="openshift-marketplace/community-operators-xgt2l"
Mar 17 12:26:07 crc kubenswrapper[4742]: I0317 12:26:07.226014 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm42t\" (UniqueName: \"kubernetes.io/projected/3eb6c2c5-9891-4e25-91d9-f4fc2e63d549-kube-api-access-fm42t\") pod \"community-operators-xgt2l\" (UID: \"3eb6c2c5-9891-4e25-91d9-f4fc2e63d549\") " pod="openshift-marketplace/community-operators-xgt2l"
Mar 17 12:26:07 crc kubenswrapper[4742]: I0317 12:26:07.226333 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3eb6c2c5-9891-4e25-91d9-f4fc2e63d549-catalog-content\") pod \"community-operators-xgt2l\" (UID: \"3eb6c2c5-9891-4e25-91d9-f4fc2e63d549\") " pod="openshift-marketplace/community-operators-xgt2l"
Mar 17 12:26:07 crc kubenswrapper[4742]: I0317 12:26:07.226644 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3eb6c2c5-9891-4e25-91d9-f4fc2e63d549-utilities\") pod \"community-operators-xgt2l\" (UID: \"3eb6c2c5-9891-4e25-91d9-f4fc2e63d549\") " pod="openshift-marketplace/community-operators-xgt2l"
Mar 17 12:26:07 crc kubenswrapper[4742]: I0317 12:26:07.251372 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm42t\" (UniqueName: \"kubernetes.io/projected/3eb6c2c5-9891-4e25-91d9-f4fc2e63d549-kube-api-access-fm42t\") pod \"community-operators-xgt2l\" (UID: \"3eb6c2c5-9891-4e25-91d9-f4fc2e63d549\") " pod="openshift-marketplace/community-operators-xgt2l"
Mar 17 12:26:07 crc kubenswrapper[4742]: I0317 12:26:07.379675 4742 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xgt2l"
Mar 17 12:26:07 crc kubenswrapper[4742]: I0317 12:26:07.665808 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xgt2l"]
Mar 17 12:26:08 crc kubenswrapper[4742]: I0317 12:26:08.030391 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rcph6/must-gather-nvl7j"]
Mar 17 12:26:08 crc kubenswrapper[4742]: I0317 12:26:08.030805 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-rcph6/must-gather-nvl7j" podUID="09d769ba-43cf-4abc-aec6-f21879cc4c38" containerName="copy" containerID="cri-o://fe2544bc03d670fc74002cffe9750fc2f05ac02966dd14a2e4670387d3d9ccbf" gracePeriod=2
Mar 17 12:26:08 crc kubenswrapper[4742]: I0317 12:26:08.043183 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rcph6/must-gather-nvl7j"]
Mar 17 12:26:08 crc kubenswrapper[4742]: I0317 12:26:08.575830 4742 generic.go:334] "Generic (PLEG): container finished" podID="3eb6c2c5-9891-4e25-91d9-f4fc2e63d549" containerID="250d275a63966ee753370fd6a6f9d3d8db754f4b4e9ad8aaa27ef862dbd0f449" exitCode=0
Mar 17 12:26:08 crc kubenswrapper[4742]: I0317 12:26:08.575868 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgt2l" event={"ID":"3eb6c2c5-9891-4e25-91d9-f4fc2e63d549","Type":"ContainerDied","Data":"250d275a63966ee753370fd6a6f9d3d8db754f4b4e9ad8aaa27ef862dbd0f449"}
Mar 17 12:26:08 crc kubenswrapper[4742]: I0317 12:26:08.575939 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgt2l" event={"ID":"3eb6c2c5-9891-4e25-91d9-f4fc2e63d549","Type":"ContainerStarted","Data":"9adfb9749ba72e799985e08cce8a908b2bce5402bc6858c23cbbb20fab15e540"}
Mar 17 12:26:09 crc kubenswrapper[4742]: I0317 12:26:09.034992 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rcph6_must-gather-nvl7j_09d769ba-43cf-4abc-aec6-f21879cc4c38/copy/0.log"
Mar 17 12:26:09 crc kubenswrapper[4742]: I0317 12:26:09.035654 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rcph6/must-gather-nvl7j"
Mar 17 12:26:09 crc kubenswrapper[4742]: I0317 12:26:09.160973 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09d769ba-43cf-4abc-aec6-f21879cc4c38-must-gather-output\") pod \"09d769ba-43cf-4abc-aec6-f21879cc4c38\" (UID: \"09d769ba-43cf-4abc-aec6-f21879cc4c38\") "
Mar 17 12:26:09 crc kubenswrapper[4742]: I0317 12:26:09.161158 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fghvc\" (UniqueName: \"kubernetes.io/projected/09d769ba-43cf-4abc-aec6-f21879cc4c38-kube-api-access-fghvc\") pod \"09d769ba-43cf-4abc-aec6-f21879cc4c38\" (UID: \"09d769ba-43cf-4abc-aec6-f21879cc4c38\") "
Mar 17 12:26:09 crc kubenswrapper[4742]: I0317 12:26:09.166157 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09d769ba-43cf-4abc-aec6-f21879cc4c38-kube-api-access-fghvc" (OuterVolumeSpecName: "kube-api-access-fghvc") pod "09d769ba-43cf-4abc-aec6-f21879cc4c38" (UID: "09d769ba-43cf-4abc-aec6-f21879cc4c38"). InnerVolumeSpecName "kube-api-access-fghvc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 12:26:09 crc kubenswrapper[4742]: I0317 12:26:09.263611 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fghvc\" (UniqueName: \"kubernetes.io/projected/09d769ba-43cf-4abc-aec6-f21879cc4c38-kube-api-access-fghvc\") on node \"crc\" DevicePath \"\""
Mar 17 12:26:09 crc kubenswrapper[4742]: I0317 12:26:09.336684 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09d769ba-43cf-4abc-aec6-f21879cc4c38-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "09d769ba-43cf-4abc-aec6-f21879cc4c38" (UID: "09d769ba-43cf-4abc-aec6-f21879cc4c38"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 17 12:26:09 crc kubenswrapper[4742]: I0317 12:26:09.365291 4742 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09d769ba-43cf-4abc-aec6-f21879cc4c38-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 17 12:26:09 crc kubenswrapper[4742]: I0317 12:26:09.586018 4742 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rcph6_must-gather-nvl7j_09d769ba-43cf-4abc-aec6-f21879cc4c38/copy/0.log"
Mar 17 12:26:09 crc kubenswrapper[4742]: I0317 12:26:09.586593 4742 generic.go:334] "Generic (PLEG): container finished" podID="09d769ba-43cf-4abc-aec6-f21879cc4c38" containerID="fe2544bc03d670fc74002cffe9750fc2f05ac02966dd14a2e4670387d3d9ccbf" exitCode=143
Mar 17 12:26:09 crc kubenswrapper[4742]: I0317 12:26:09.586640 4742 scope.go:117] "RemoveContainer" containerID="fe2544bc03d670fc74002cffe9750fc2f05ac02966dd14a2e4670387d3d9ccbf"
Mar 17 12:26:09 crc kubenswrapper[4742]: I0317 12:26:09.586666 4742 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-rcph6/must-gather-nvl7j" Mar 17 12:26:09 crc kubenswrapper[4742]: I0317 12:26:09.628546 4742 scope.go:117] "RemoveContainer" containerID="2b007bf8242742771d437d12d0e69b64e5f28004bccd213e799b8a450c45389d" Mar 17 12:26:09 crc kubenswrapper[4742]: I0317 12:26:09.682702 4742 scope.go:117] "RemoveContainer" containerID="fe2544bc03d670fc74002cffe9750fc2f05ac02966dd14a2e4670387d3d9ccbf" Mar 17 12:26:09 crc kubenswrapper[4742]: E0317 12:26:09.683172 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe2544bc03d670fc74002cffe9750fc2f05ac02966dd14a2e4670387d3d9ccbf\": container with ID starting with fe2544bc03d670fc74002cffe9750fc2f05ac02966dd14a2e4670387d3d9ccbf not found: ID does not exist" containerID="fe2544bc03d670fc74002cffe9750fc2f05ac02966dd14a2e4670387d3d9ccbf" Mar 17 12:26:09 crc kubenswrapper[4742]: I0317 12:26:09.683204 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe2544bc03d670fc74002cffe9750fc2f05ac02966dd14a2e4670387d3d9ccbf"} err="failed to get container status \"fe2544bc03d670fc74002cffe9750fc2f05ac02966dd14a2e4670387d3d9ccbf\": rpc error: code = NotFound desc = could not find container \"fe2544bc03d670fc74002cffe9750fc2f05ac02966dd14a2e4670387d3d9ccbf\": container with ID starting with fe2544bc03d670fc74002cffe9750fc2f05ac02966dd14a2e4670387d3d9ccbf not found: ID does not exist" Mar 17 12:26:09 crc kubenswrapper[4742]: I0317 12:26:09.683231 4742 scope.go:117] "RemoveContainer" containerID="2b007bf8242742771d437d12d0e69b64e5f28004bccd213e799b8a450c45389d" Mar 17 12:26:09 crc kubenswrapper[4742]: E0317 12:26:09.683649 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b007bf8242742771d437d12d0e69b64e5f28004bccd213e799b8a450c45389d\": container with ID starting with 2b007bf8242742771d437d12d0e69b64e5f28004bccd213e799b8a450c45389d not found: ID does not exist" containerID="2b007bf8242742771d437d12d0e69b64e5f28004bccd213e799b8a450c45389d" Mar 17 12:26:09 crc kubenswrapper[4742]: I0317 12:26:09.683671 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b007bf8242742771d437d12d0e69b64e5f28004bccd213e799b8a450c45389d"} err="failed to get container status \"2b007bf8242742771d437d12d0e69b64e5f28004bccd213e799b8a450c45389d\": rpc error: code = NotFound desc = could not find container \"2b007bf8242742771d437d12d0e69b64e5f28004bccd213e799b8a450c45389d\": container with ID starting with 2b007bf8242742771d437d12d0e69b64e5f28004bccd213e799b8a450c45389d not found: ID does not exist" Mar 17 12:26:10 crc kubenswrapper[4742]: I0317 12:26:10.676808 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09d769ba-43cf-4abc-aec6-f21879cc4c38" path="/var/lib/kubelet/pods/09d769ba-43cf-4abc-aec6-f21879cc4c38/volumes" Mar 17 12:26:11 crc kubenswrapper[4742]: I0317 12:26:11.415098 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z4rjw" Mar 17 12:26:11 crc kubenswrapper[4742]: I0317 12:26:11.415155 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z4rjw" Mar 17 12:26:11 crc kubenswrapper[4742]: I0317 12:26:11.471258 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z4rjw" Mar 17 
12:26:11 crc kubenswrapper[4742]: I0317 12:26:11.657406 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z4rjw" Mar 17 12:26:13 crc kubenswrapper[4742]: I0317 12:26:13.432824 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z4rjw"] Mar 17 12:26:13 crc kubenswrapper[4742]: I0317 12:26:13.631927 4742 generic.go:334] "Generic (PLEG): container finished" podID="3eb6c2c5-9891-4e25-91d9-f4fc2e63d549" containerID="f9e272ab09ddedc4e471da16b4091f1beda8e72073a610bbbf13fc75a4bfa0da" exitCode=0 Mar 17 12:26:13 crc kubenswrapper[4742]: I0317 12:26:13.632030 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgt2l" event={"ID":"3eb6c2c5-9891-4e25-91d9-f4fc2e63d549","Type":"ContainerDied","Data":"f9e272ab09ddedc4e471da16b4091f1beda8e72073a610bbbf13fc75a4bfa0da"} Mar 17 12:26:13 crc kubenswrapper[4742]: I0317 12:26:13.632491 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z4rjw" podUID="6a3b6cbb-6fe3-4264-90fa-e9f12f36b600" containerName="registry-server" containerID="cri-o://fd83ca315ba90e620816a1e4cdf4716f1b5fda617dde480bcaf50f681e4dd57f" gracePeriod=2 Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.158856 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z4rjw" Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.315663 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbsfq\" (UniqueName: \"kubernetes.io/projected/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-kube-api-access-sbsfq\") pod \"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600\" (UID: \"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600\") " Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.315755 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-catalog-content\") pod \"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600\" (UID: \"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600\") " Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.315920 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-utilities\") pod \"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600\" (UID: \"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600\") " Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.317482 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-utilities" (OuterVolumeSpecName: "utilities") pod "6a3b6cbb-6fe3-4264-90fa-e9f12f36b600" (UID: "6a3b6cbb-6fe3-4264-90fa-e9f12f36b600"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.327883 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-kube-api-access-sbsfq" (OuterVolumeSpecName: "kube-api-access-sbsfq") pod "6a3b6cbb-6fe3-4264-90fa-e9f12f36b600" (UID: "6a3b6cbb-6fe3-4264-90fa-e9f12f36b600"). InnerVolumeSpecName "kube-api-access-sbsfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.418369 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.418410 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbsfq\" (UniqueName: \"kubernetes.io/projected/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-kube-api-access-sbsfq\") on node \"crc\" DevicePath \"\"" Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.460761 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a3b6cbb-6fe3-4264-90fa-e9f12f36b600" (UID: "6a3b6cbb-6fe3-4264-90fa-e9f12f36b600"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.520108 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.643048 4742 generic.go:334] "Generic (PLEG): container finished" podID="6a3b6cbb-6fe3-4264-90fa-e9f12f36b600" containerID="fd83ca315ba90e620816a1e4cdf4716f1b5fda617dde480bcaf50f681e4dd57f" exitCode=0 Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.643119 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4rjw" event={"ID":"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600","Type":"ContainerDied","Data":"fd83ca315ba90e620816a1e4cdf4716f1b5fda617dde480bcaf50f681e4dd57f"} Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.643149 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4rjw" event={"ID":"6a3b6cbb-6fe3-4264-90fa-e9f12f36b600","Type":"ContainerDied","Data":"4f096a74c3b8fcd6a9c125e0a2dd207519086c949b8c3d2304ddd40331533488"} Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.643170 4742 scope.go:117] "RemoveContainer" containerID="fd83ca315ba90e620816a1e4cdf4716f1b5fda617dde480bcaf50f681e4dd57f" Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.643307 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z4rjw" Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.654665 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xgt2l" event={"ID":"3eb6c2c5-9891-4e25-91d9-f4fc2e63d549","Type":"ContainerStarted","Data":"2de6d849beffc86009508143ba28484d60e6297f4c63d6b097dcd3aa48d60e48"} Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.675443 4742 scope.go:117] "RemoveContainer" containerID="a002ab58d97d47b88ff131efc59eec4f5e8249d7863f7c087fe0c36a3af860ae" Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.708032 4742 scope.go:117] "RemoveContainer" containerID="ed069b4cba40f4015293bd931120de6b0c658093fc94180f036b70c5d5344049" Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.718949 4742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xgt2l" podStartSLOduration=2.205051215 podStartE2EDuration="7.718922883s" podCreationTimestamp="2026-03-17 12:26:07 +0000 UTC" firstStartedPulling="2026-03-17 12:26:08.577889948 +0000 UTC m=+4471.704017696" lastFinishedPulling="2026-03-17 12:26:14.091761596 +0000 UTC m=+4477.217889364" observedRunningTime="2026-03-17 12:26:14.699943732 +0000 UTC m=+4477.826071510" watchObservedRunningTime="2026-03-17 12:26:14.718922883 +0000 UTC m=+4477.845050661" Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.732872 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z4rjw"] Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.742019 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z4rjw"] Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.751210 4742 scope.go:117] "RemoveContainer" containerID="fd83ca315ba90e620816a1e4cdf4716f1b5fda617dde480bcaf50f681e4dd57f" Mar 17 12:26:14 crc kubenswrapper[4742]: E0317 12:26:14.751656 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd83ca315ba90e620816a1e4cdf4716f1b5fda617dde480bcaf50f681e4dd57f\": container with ID starting with fd83ca315ba90e620816a1e4cdf4716f1b5fda617dde480bcaf50f681e4dd57f not found: ID does not exist" containerID="fd83ca315ba90e620816a1e4cdf4716f1b5fda617dde480bcaf50f681e4dd57f" Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.751701 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd83ca315ba90e620816a1e4cdf4716f1b5fda617dde480bcaf50f681e4dd57f"} err="failed to get container status \"fd83ca315ba90e620816a1e4cdf4716f1b5fda617dde480bcaf50f681e4dd57f\": rpc error: code = NotFound desc = could not find container \"fd83ca315ba90e620816a1e4cdf4716f1b5fda617dde480bcaf50f681e4dd57f\": container with ID starting with fd83ca315ba90e620816a1e4cdf4716f1b5fda617dde480bcaf50f681e4dd57f not found: ID does not exist" Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.751741 4742 scope.go:117] "RemoveContainer" containerID="a002ab58d97d47b88ff131efc59eec4f5e8249d7863f7c087fe0c36a3af860ae" Mar 17 12:26:14 crc kubenswrapper[4742]: E0317 12:26:14.752727 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a002ab58d97d47b88ff131efc59eec4f5e8249d7863f7c087fe0c36a3af860ae\": container with ID starting with a002ab58d97d47b88ff131efc59eec4f5e8249d7863f7c087fe0c36a3af860ae not found: ID does not exist" 
containerID="a002ab58d97d47b88ff131efc59eec4f5e8249d7863f7c087fe0c36a3af860ae" Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.752773 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a002ab58d97d47b88ff131efc59eec4f5e8249d7863f7c087fe0c36a3af860ae"} err="failed to get container status \"a002ab58d97d47b88ff131efc59eec4f5e8249d7863f7c087fe0c36a3af860ae\": rpc error: code = NotFound desc = could not find container \"a002ab58d97d47b88ff131efc59eec4f5e8249d7863f7c087fe0c36a3af860ae\": container with ID starting with a002ab58d97d47b88ff131efc59eec4f5e8249d7863f7c087fe0c36a3af860ae not found: ID does not exist" Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.752789 4742 scope.go:117] "RemoveContainer" containerID="ed069b4cba40f4015293bd931120de6b0c658093fc94180f036b70c5d5344049" Mar 17 12:26:14 crc kubenswrapper[4742]: E0317 12:26:14.753408 4742 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed069b4cba40f4015293bd931120de6b0c658093fc94180f036b70c5d5344049\": container with ID starting with ed069b4cba40f4015293bd931120de6b0c658093fc94180f036b70c5d5344049 not found: ID does not exist" containerID="ed069b4cba40f4015293bd931120de6b0c658093fc94180f036b70c5d5344049" Mar 17 12:26:14 crc kubenswrapper[4742]: I0317 12:26:14.753433 4742 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed069b4cba40f4015293bd931120de6b0c658093fc94180f036b70c5d5344049"} err="failed to get container status \"ed069b4cba40f4015293bd931120de6b0c658093fc94180f036b70c5d5344049\": rpc error: code = NotFound desc = could not find container \"ed069b4cba40f4015293bd931120de6b0c658093fc94180f036b70c5d5344049\": container with ID starting with ed069b4cba40f4015293bd931120de6b0c658093fc94180f036b70c5d5344049 not found: ID does not exist" Mar 17 12:26:16 crc kubenswrapper[4742]: I0317 12:26:16.677000 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a3b6cbb-6fe3-4264-90fa-e9f12f36b600" path="/var/lib/kubelet/pods/6a3b6cbb-6fe3-4264-90fa-e9f12f36b600/volumes" Mar 17 12:26:17 crc kubenswrapper[4742]: I0317 12:26:17.380280 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xgt2l" Mar 17 12:26:17 crc kubenswrapper[4742]: I0317 12:26:17.380377 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xgt2l" Mar 17 12:26:17 crc kubenswrapper[4742]: I0317 12:26:17.427068 4742 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xgt2l" Mar 17 12:26:20 crc kubenswrapper[4742]: I0317 12:26:20.662900 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0" Mar 17 12:26:20 crc kubenswrapper[4742]: E0317 12:26:20.663877 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:26:27 crc kubenswrapper[4742]: I0317 12:26:27.616778 4742 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-xgt2l" Mar 17 12:26:27 crc kubenswrapper[4742]: I0317 12:26:27.696437 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xgt2l"] Mar 17 12:26:27 crc kubenswrapper[4742]: I0317 12:26:27.757328 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f2tmr"] Mar 17 12:26:27 crc kubenswrapper[4742]: I0317 12:26:27.757595 4742 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f2tmr" podUID="3aae4d83-a6a4-440f-b772-a5cb34a9f1fa" containerName="registry-server" containerID="cri-o://a24bb3423fd6c614d478fa816e35e28e57a04f304ab6bd2257a2a2a21148e2af" gracePeriod=2 Mar 17 12:26:28 crc kubenswrapper[4742]: I0317 12:26:28.814467 4742 generic.go:334] "Generic (PLEG): container finished" podID="3aae4d83-a6a4-440f-b772-a5cb34a9f1fa" containerID="a24bb3423fd6c614d478fa816e35e28e57a04f304ab6bd2257a2a2a21148e2af" exitCode=0 Mar 17 12:26:28 crc kubenswrapper[4742]: I0317 12:26:28.814538 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2tmr" event={"ID":"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa","Type":"ContainerDied","Data":"a24bb3423fd6c614d478fa816e35e28e57a04f304ab6bd2257a2a2a21148e2af"} Mar 17 12:26:28 crc kubenswrapper[4742]: I0317 12:26:28.815098 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f2tmr" event={"ID":"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa","Type":"ContainerDied","Data":"de81e38d8f33b54616e137f8ba6e9db5c1b64a47fcf10d6c19b779cb6872df1a"} Mar 17 12:26:28 crc kubenswrapper[4742]: I0317 12:26:28.815147 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de81e38d8f33b54616e137f8ba6e9db5c1b64a47fcf10d6c19b779cb6872df1a" Mar 17 12:26:28 crc kubenswrapper[4742]: I0317 12:26:28.817139 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f2tmr" Mar 17 12:26:28 crc kubenswrapper[4742]: I0317 12:26:28.899704 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-catalog-content\") pod \"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa\" (UID: \"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa\") " Mar 17 12:26:28 crc kubenswrapper[4742]: I0317 12:26:28.899739 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fjm7\" (UniqueName: \"kubernetes.io/projected/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-kube-api-access-2fjm7\") pod \"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa\" (UID: \"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa\") " Mar 17 12:26:28 crc kubenswrapper[4742]: I0317 12:26:28.899920 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-utilities\") pod \"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa\" (UID: \"3aae4d83-a6a4-440f-b772-a5cb34a9f1fa\") " Mar 17 12:26:28 crc kubenswrapper[4742]: I0317 12:26:28.900412 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-utilities" (OuterVolumeSpecName: "utilities") pod "3aae4d83-a6a4-440f-b772-a5cb34a9f1fa" (UID: "3aae4d83-a6a4-440f-b772-a5cb34a9f1fa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:26:28 crc kubenswrapper[4742]: I0317 12:26:28.916165 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-kube-api-access-2fjm7" (OuterVolumeSpecName: "kube-api-access-2fjm7") pod "3aae4d83-a6a4-440f-b772-a5cb34a9f1fa" (UID: "3aae4d83-a6a4-440f-b772-a5cb34a9f1fa"). InnerVolumeSpecName "kube-api-access-2fjm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:26:28 crc kubenswrapper[4742]: I0317 12:26:28.964756 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3aae4d83-a6a4-440f-b772-a5cb34a9f1fa" (UID: "3aae4d83-a6a4-440f-b772-a5cb34a9f1fa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 17 12:26:29 crc kubenswrapper[4742]: I0317 12:26:29.001832 4742 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-utilities\") on node \"crc\" DevicePath \"\"" Mar 17 12:26:29 crc kubenswrapper[4742]: I0317 12:26:29.001881 4742 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 17 12:26:29 crc kubenswrapper[4742]: I0317 12:26:29.001893 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fjm7\" (UniqueName: \"kubernetes.io/projected/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa-kube-api-access-2fjm7\") on node \"crc\" DevicePath \"\"" Mar 17 12:26:29 crc kubenswrapper[4742]: I0317 12:26:29.830856 4742 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f2tmr" Mar 17 12:26:29 crc kubenswrapper[4742]: I0317 12:26:29.882977 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f2tmr"] Mar 17 12:26:29 crc kubenswrapper[4742]: I0317 12:26:29.892853 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f2tmr"] Mar 17 12:26:30 crc kubenswrapper[4742]: I0317 12:26:30.676571 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aae4d83-a6a4-440f-b772-a5cb34a9f1fa" path="/var/lib/kubelet/pods/3aae4d83-a6a4-440f-b772-a5cb34a9f1fa/volumes" Mar 17 12:26:33 crc kubenswrapper[4742]: I0317 12:26:33.663081 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0" Mar 17 12:26:33 crc kubenswrapper[4742]: E0317 12:26:33.663628 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:26:48 crc kubenswrapper[4742]: I0317 12:26:48.681131 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0" Mar 17 12:26:48 crc kubenswrapper[4742]: E0317 12:26:48.682045 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:26:58 crc kubenswrapper[4742]: I0317 12:26:58.582701 4742 scope.go:117] "RemoveContainer" containerID="5f19b5d1c55fd11d2c47da31b4e72dbd67065d95f8328d6cd661c226d8841820" Mar 17 12:26:58 crc kubenswrapper[4742]: I0317 12:26:58.624756 4742 scope.go:117] "RemoveContainer" containerID="a24bb3423fd6c614d478fa816e35e28e57a04f304ab6bd2257a2a2a21148e2af" Mar 17 12:26:58 crc kubenswrapper[4742]: I0317 12:26:58.673701 4742 scope.go:117] "RemoveContainer" containerID="4d969a739e06c194141e37ad7215cd875c23d04469edd6a4ddd954975f81fa81" Mar 17 12:26:58 crc kubenswrapper[4742]: I0317 12:26:58.705476 4742 scope.go:117] "RemoveContainer" containerID="3c8d221bed280fb0f266eb63f78a56d3a62938da5d175f727e0c935d0b562039" Mar 17 12:26:58 crc kubenswrapper[4742]: I0317 12:26:58.785718 4742 scope.go:117] "RemoveContainer" containerID="2f7b501fee989bcec84dd1b6d910b523354603980aae8d86706aab0522b31a74" Mar 17 12:27:03 crc kubenswrapper[4742]: I0317 12:27:03.663287 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0" Mar 17 12:27:03 crc kubenswrapper[4742]: E0317 12:27:03.664347 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" 
podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:27:15 crc kubenswrapper[4742]: I0317 12:27:15.663146 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0" Mar 17 12:27:15 crc kubenswrapper[4742]: E0317 12:27:15.663996 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:27:28 crc kubenswrapper[4742]: I0317 12:27:28.669860 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0" Mar 17 12:27:28 crc kubenswrapper[4742]: E0317 12:27:28.670777 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:27:41 crc kubenswrapper[4742]: I0317 12:27:41.663384 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0" Mar 17 12:27:41 crc kubenswrapper[4742]: E0317 12:27:41.664466 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:27:54 crc kubenswrapper[4742]: I0317 12:27:54.664097 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0" Mar 17 12:27:54 crc kubenswrapper[4742]: E0317 12:27:54.665213 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.165823 4742 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29562508-8gknr"] Mar 17 12:28:00 crc kubenswrapper[4742]: E0317 12:28:00.167023 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aae4d83-a6a4-440f-b772-a5cb34a9f1fa" containerName="extract-utilities" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.167046 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aae4d83-a6a4-440f-b772-a5cb34a9f1fa" containerName="extract-utilities" Mar 17 12:28:00 crc kubenswrapper[4742]: E0317 12:28:00.167069 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d769ba-43cf-4abc-aec6-f21879cc4c38" containerName="gather" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 
12:28:00.167080 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d769ba-43cf-4abc-aec6-f21879cc4c38" containerName="gather" Mar 17 12:28:00 crc kubenswrapper[4742]: E0317 12:28:00.167102 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3b6cbb-6fe3-4264-90fa-e9f12f36b600" containerName="registry-server" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.167115 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3b6cbb-6fe3-4264-90fa-e9f12f36b600" containerName="registry-server" Mar 17 12:28:00 crc kubenswrapper[4742]: E0317 12:28:00.167144 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d769ba-43cf-4abc-aec6-f21879cc4c38" containerName="copy" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.167154 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d769ba-43cf-4abc-aec6-f21879cc4c38" containerName="copy" Mar 17 12:28:00 crc kubenswrapper[4742]: E0317 12:28:00.167172 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aae4d83-a6a4-440f-b772-a5cb34a9f1fa" containerName="registry-server" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.167182 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aae4d83-a6a4-440f-b772-a5cb34a9f1fa" containerName="registry-server" Mar 17 12:28:00 crc kubenswrapper[4742]: E0317 12:28:00.167220 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3b6cbb-6fe3-4264-90fa-e9f12f36b600" containerName="extract-utilities" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.167231 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3b6cbb-6fe3-4264-90fa-e9f12f36b600" containerName="extract-utilities" Mar 17 12:28:00 crc kubenswrapper[4742]: E0317 12:28:00.167255 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a3b6cbb-6fe3-4264-90fa-e9f12f36b600" containerName="extract-content" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.167268 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a3b6cbb-6fe3-4264-90fa-e9f12f36b600" containerName="extract-content" Mar 17 12:28:00 crc kubenswrapper[4742]: E0317 12:28:00.167279 4742 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aae4d83-a6a4-440f-b772-a5cb34a9f1fa" containerName="extract-content" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.167289 4742 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aae4d83-a6a4-440f-b772-a5cb34a9f1fa" containerName="extract-content" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.167553 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="09d769ba-43cf-4abc-aec6-f21879cc4c38" containerName="gather" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.167581 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a3b6cbb-6fe3-4264-90fa-e9f12f36b600" containerName="registry-server" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.167596 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aae4d83-a6a4-440f-b772-a5cb34a9f1fa" containerName="registry-server" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.167629 4742 memory_manager.go:354] "RemoveStaleState removing state" podUID="09d769ba-43cf-4abc-aec6-f21879cc4c38" containerName="copy" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.168516 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562508-8gknr" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.173741 4742 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2krlk" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.174456 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.179828 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562508-8gknr"] Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.182146 4742 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.291440 4742 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8kjs\" (UniqueName: \"kubernetes.io/projected/de4eb604-d5a1-4967-ad55-d4ee9244e613-kube-api-access-x8kjs\") pod \"auto-csr-approver-29562508-8gknr\" (UID: \"de4eb604-d5a1-4967-ad55-d4ee9244e613\") " pod="openshift-infra/auto-csr-approver-29562508-8gknr" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.393102 4742 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8kjs\" (UniqueName: \"kubernetes.io/projected/de4eb604-d5a1-4967-ad55-d4ee9244e613-kube-api-access-x8kjs\") pod \"auto-csr-approver-29562508-8gknr\" (UID: \"de4eb604-d5a1-4967-ad55-d4ee9244e613\") " pod="openshift-infra/auto-csr-approver-29562508-8gknr" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.426541 4742 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8kjs\" (UniqueName: \"kubernetes.io/projected/de4eb604-d5a1-4967-ad55-d4ee9244e613-kube-api-access-x8kjs\") pod \"auto-csr-approver-29562508-8gknr\" (UID: \"de4eb604-d5a1-4967-ad55-d4ee9244e613\") " pod="openshift-infra/auto-csr-approver-29562508-8gknr" Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.503036 4742 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29562508-8gknr" Mar 17 12:28:00 crc kubenswrapper[4742]: W0317 12:28:00.973276 4742 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde4eb604_d5a1_4967_ad55_d4ee9244e613.slice/crio-c14185dbd3696625cd7cfad3082988fe753b743c5b0630dfc36c1201bbd61f64 WatchSource:0}: Error finding container c14185dbd3696625cd7cfad3082988fe753b743c5b0630dfc36c1201bbd61f64: Status 404 returned error can't find the container with id c14185dbd3696625cd7cfad3082988fe753b743c5b0630dfc36c1201bbd61f64 Mar 17 12:28:00 crc kubenswrapper[4742]: I0317 12:28:00.978092 4742 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29562508-8gknr"] Mar 17 12:28:01 crc kubenswrapper[4742]: I0317 12:28:01.957552 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562508-8gknr" event={"ID":"de4eb604-d5a1-4967-ad55-d4ee9244e613","Type":"ContainerStarted","Data":"c14185dbd3696625cd7cfad3082988fe753b743c5b0630dfc36c1201bbd61f64"} Mar 17 12:28:02 crc kubenswrapper[4742]: I0317 12:28:02.974233 4742 generic.go:334] "Generic (PLEG): container finished" podID="de4eb604-d5a1-4967-ad55-d4ee9244e613" containerID="53b6bffc57596847151e2c297e376c3c4c7fa1fbfad70f612cc381639b86f252" exitCode=0 Mar 17 12:28:02 crc kubenswrapper[4742]: I0317 12:28:02.974337 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562508-8gknr" event={"ID":"de4eb604-d5a1-4967-ad55-d4ee9244e613","Type":"ContainerDied","Data":"53b6bffc57596847151e2c297e376c3c4c7fa1fbfad70f612cc381639b86f252"} Mar 17 12:28:04 crc kubenswrapper[4742]: I0317 12:28:04.379045 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562508-8gknr" Mar 17 12:28:04 crc kubenswrapper[4742]: I0317 12:28:04.482531 4742 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8kjs\" (UniqueName: \"kubernetes.io/projected/de4eb604-d5a1-4967-ad55-d4ee9244e613-kube-api-access-x8kjs\") pod \"de4eb604-d5a1-4967-ad55-d4ee9244e613\" (UID: \"de4eb604-d5a1-4967-ad55-d4ee9244e613\") " Mar 17 12:28:04 crc kubenswrapper[4742]: I0317 12:28:04.491204 4742 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de4eb604-d5a1-4967-ad55-d4ee9244e613-kube-api-access-x8kjs" (OuterVolumeSpecName: "kube-api-access-x8kjs") pod "de4eb604-d5a1-4967-ad55-d4ee9244e613" (UID: "de4eb604-d5a1-4967-ad55-d4ee9244e613"). InnerVolumeSpecName "kube-api-access-x8kjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 12:28:04 crc kubenswrapper[4742]: I0317 12:28:04.586012 4742 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8kjs\" (UniqueName: \"kubernetes.io/projected/de4eb604-d5a1-4967-ad55-d4ee9244e613-kube-api-access-x8kjs\") on node \"crc\" DevicePath \"\"" Mar 17 12:28:05 crc kubenswrapper[4742]: I0317 12:28:05.006470 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29562508-8gknr" event={"ID":"de4eb604-d5a1-4967-ad55-d4ee9244e613","Type":"ContainerDied","Data":"c14185dbd3696625cd7cfad3082988fe753b743c5b0630dfc36c1201bbd61f64"} Mar 17 12:28:05 crc kubenswrapper[4742]: I0317 12:28:05.006528 4742 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c14185dbd3696625cd7cfad3082988fe753b743c5b0630dfc36c1201bbd61f64" Mar 17 12:28:05 crc kubenswrapper[4742]: I0317 12:28:05.006539 4742 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29562508-8gknr" Mar 17 12:28:05 crc kubenswrapper[4742]: I0317 12:28:05.483351 4742 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29562502-mqx27"] Mar 17 12:28:05 crc kubenswrapper[4742]: I0317 12:28:05.508807 4742 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29562502-mqx27"] Mar 17 12:28:06 crc kubenswrapper[4742]: I0317 12:28:06.681638 4742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c09ddacc-c7ba-4eae-afd2-dc4ad528c497" path="/var/lib/kubelet/pods/c09ddacc-c7ba-4eae-afd2-dc4ad528c497/volumes" Mar 17 12:28:07 crc kubenswrapper[4742]: I0317 12:28:07.662672 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0" Mar 17 12:28:07 crc kubenswrapper[4742]: E0317 12:28:07.662982 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:28:21 crc kubenswrapper[4742]: I0317 12:28:21.662599 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0" Mar 17 12:28:21 crc kubenswrapper[4742]: E0317 12:28:21.665262 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:28:33 crc kubenswrapper[4742]: I0317 12:28:33.664156 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0" Mar 17 12:28:33 crc kubenswrapper[4742]: E0317 12:28:33.665234 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:28:45 crc kubenswrapper[4742]: I0317 12:28:45.663346 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0" Mar 17 12:28:45 crc kubenswrapper[4742]: E0317 12:28:45.664520 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:28:58 crc kubenswrapper[4742]: I0317 12:28:58.969285 4742 scope.go:117] "RemoveContainer" containerID="b73bf47f612cd79dab2317473f033cfcc59a52b6a0baf462b57bcd87d5c4dd23" Mar 17 12:29:00 crc kubenswrapper[4742]: I0317 12:29:00.663859 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0" Mar 17 12:29:00 crc kubenswrapper[4742]: E0317 12:29:00.664469 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:29:11 crc kubenswrapper[4742]: I0317 12:29:11.662643 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0" Mar 17 12:29:11 crc kubenswrapper[4742]: E0317 12:29:11.663337 4742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5jxxw_openshift-machine-config-operator(5e11ad39-38bb-4b70-9cac-ce078b37f882)\"" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" podUID="5e11ad39-38bb-4b70-9cac-ce078b37f882" Mar 17 12:29:25 crc kubenswrapper[4742]: I0317 12:29:25.663753 4742 scope.go:117] "RemoveContainer" containerID="ff68d146ee7e54000271ca9db2a2d3738a45a22d373dad646c496024915d0bf0" Mar 17 12:29:26 crc kubenswrapper[4742]: I0317 12:29:26.910622 4742 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5jxxw" event={"ID":"5e11ad39-38bb-4b70-9cac-ce078b37f882","Type":"ContainerStarted","Data":"22f7670af91427af205471823471b68790064a1f6599d9671ace1a7ca7269266"}